8:00 AM–9:00 AM - Compute Cloud Performance Showdown: Oracle, AWS, IBM, Google, Azure - @Ahmed_Aboulnaga
I really liked the subject matter, but what I liked even more was how Ahmed presented his findings: he focused mainly on the data, and he took care to make the VMs from the different providers as comparable as possible before he started. One caveat, told by way of a story: Netflix found that not all AWS VMs perform the same, so they will build thousands at a time, test them, and destroy the ones that do not perform up to expectations. Ahmed's point was that he did not have the time or resources to do that, so he assumed each vendor would provide a VM that performed as well as it could, and he avoided dedicated hardware options since that would defeat the purpose of testing actual VMs. In general, he found that AWS has a slight edge over the competition (mostly due to its newer, higher-end CPU model), Azure was the lowest performer in most tests, and Oracle Cloud was roughly half the price of the other options.
9:15 AM–10:15 AM - Enabling Canary Testing with Edition-Based Redefinition - @mohio73
In this presentation, Michael spent a lot of time explaining what canary testing and Edition-Based Redefinition (EBR) are before running a very extensive demo showing how sessions connecting to the database's default ORA$BASE edition get redirected to the new VERSION1 edition once the triggers are set up. I really did like this, and the demo was great, but I would have liked a little more about the canary release strategy itself. Visit https://www.github.com/mfhaynes/canary_ebr to get his code for the demo and see exactly what he was doing! A client-side sketch of the same idea follows below.
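Michael's demo did the redirection server-side with triggers; for contrast, here is a minimal client-side sketch of the same canary idea using cx_Oracle, which can pick an edition at connect time. The VERSION1 and ORA$BASE edition names come from the talk, but the credentials, DSN, and the get_greeting function are placeholders of my own, not code from the canary_ebr repo.

```python
# Minimal sketch: send a small "canary" slice of new sessions to the new edition
# while everything else stays on the base edition. Credentials, DSN, and
# get_greeting() are placeholders, not from Michael's canary_ebr repo.
import random
import cx_Oracle

DSN = "dbhost/ORCLPDB1"
CANARY_PERCENT = 10   # roughly 10% of sessions exercise the new code

def connect_with_canary():
    edition = "VERSION1" if random.randint(1, 100) <= CANARY_PERCENT else "ORA$BASE"
    conn = cx_Oracle.connect("app_user", "app_password", DSN, edition=edition)
    return conn, edition

conn, edition = connect_with_canary()
cur = conn.cursor()
# Call an editioned PL/SQL function; the session's edition decides which version runs.
greeting = cur.callfunc("get_greeting", str, ["world"])
print(f"{edition}: {greeting}")
conn.close()
```

The appeal of the edition approach is that rolling the canary forward or back is just a matter of changing which edition new sessions pick up, without kicking out the sessions already connected.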
10:30 AM-11:30 AM - Automate the operation of your Oracle Cloud infrastructure - @ncalerouy
This was a lot of screenshots and information, but there were a few tidbits here and there that I found useful. For example, make sure you look for the new version of the OCI documentation on docs.cloud.oracle.com, as any OCI documents on docs.oracle.com actually refer to OCI Classic, not the current OCI. If you want to work with the APIs, there are a few links to check out at docs.cloud.oracle.com/iaas/api/ and docs.cloud.oracle.com/iaas/Content/API/Concepts/devopstools.htm, and when you are spinning up instances you have to make sure port 22 is open, so read MOS Note 2433870.1 for all the details.
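Since the theme was automating OCI, here is a minimal sketch of the API route using the oci Python SDK rather than console screenshots; it assumes you already have a ~/.oci/config file set up as the docs above describe, and the compartment OCID is a placeholder.

```python
# Minimal sketch: list the compute instances in one compartment with the
# oci Python SDK. Assumes ~/.oci/config already exists; the compartment
# OCID below is a placeholder.
import oci

config = oci.config.from_file()              # reads ~/.oci/config by default
compute = oci.core.ComputeClient(config)

compartment_id = "ocid1.compartment.oc1..exampleuniqueid"
instances = compute.list_instances(compartment_id=compartment_id).data
for instance in instances:
    print(instance.display_name, instance.lifecycle_state)
```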
Lunch again! Beginning to be a habit here, but I had to make it a short lunch since I wanted to visit a session at noon, and the flat iron steak was pretty good even in a hurry.
12:00 PM-12:30 PM - All you need to know about Backup and Recovery - @fcomunoz
Did you know that Francisco has built a free tool called CrashSimulator that will do just that: break a database so you can practice recovering it? I am looking forward to combining this tool with the Vagrant Oracle database from Blaine's blog post and blowing up the database again, and again, and again!! Then we saw the Oracle Database Backup Cloud Service, which keeps your data secure (the encryption keys stay on site, not in the cloud) and needs no ACO license to compress your backups. Oh, and did you know that Data Pump and RMAN backup/recovery options are actually built into Oracle SQL Developer?!? I had no idea!
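To keep with the blow-it-up-and-recover-it theme, here is a minimal sketch of scripting the recovery half of that practice loop from Python. This is explicitly not CrashSimulator (whose interface I have not dug into); it assumes rman is on the PATH and "/ as sysdba" OS authentication works on the database host.

```python
# Minimal sketch: feed an RMAN restore/recover script to `rman target /` via stdin
# so a practice recovery can be repeated over and over. Not CrashSimulator;
# assumes rman is on PATH and OS authentication works.
import subprocess

RECOVER_SCRIPT = """
startup mount;
restore database;
recover database;
alter database open;
"""

result = subprocess.run(
    ["rman", "target", "/"],
    input=RECOVER_SCRIPT,
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError("RMAN reported errors; check the output above")
```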
12:45 PM–1:45 PM - Data Analysis: SQL vs. Pandas Python Package - Rama Koganti
Python is about trading speed of code for speed of development. An example: before buying YouTube, Google was doing video with around 1,000 developers using C/C++, while YouTube had around 20 developers using Python and was getting features to market quicker, which is why Google decided to stop competing and just buy them. Very interesting! A lot of this presentation was demo, but check out his repository at https://www.github.com/ramak919 to get up to date on the pandas Python package, which turns tables into DataFrames and supports a LOT of data sources to pull your data from! Again, another tool to combine with something else I learned at the conference: I can use Python to connect via the cx_Oracle package, call pandas to pull my data into a DataFrame, and then do my analysis, convert it, and so on; a quick sketch of that combination is below.
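Here is that sketch; the connection details and the employees table are placeholders of my own.

```python
# Minimal sketch: pull an Oracle query into a pandas DataFrame through cx_Oracle.
# Credentials, DSN, and the "employees" table are placeholders.
import cx_Oracle
import pandas as pd

conn = cx_Oracle.connect("hr", "hr_password", "dbhost/ORCLPDB1")

df = pd.read_sql("select department_id, salary from employees", con=conn)
print(df.groupby("department_id")["salary"].mean())   # analysis is easy once it is a DataFrame

conn.close()
```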
2:00 PM-3:00 PM - DevOps Tools for Database Developers - @OraBlaineOS
Key in this is how DevOps is more about culture than tools, and about how you interact with your TEAMS. Again there is a GitHub repository available, https://www.github.com/oracle/dino-date, which he used in combination with Oracle Application Container Cloud Service to show off Liquibase (version control for your database: change sets, auto-generated SQL, change tracking, rollbacks, preconditions, diffs / reverse engineering, and docs) and utPLSQL (a unit testing framework for PL/SQL).
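As a small taste of the utPLSQL side, here is a hedged sketch of kicking off a test run from Python and reading the report back through DBMS_OUTPUT; the credentials and the demo_tests package name are placeholders, not pieces of the dino-date repo.

```python
# Minimal sketch: run a utPLSQL test package from Python with cx_Oracle and print
# the report that ut.run() writes to DBMS_OUTPUT. Credentials and the
# "demo_tests" package are placeholders, not from dino-date.
import cx_Oracle

conn = cx_Oracle.connect("app_user", "app_password", "dbhost/ORCLPDB1")
cur = conn.cursor()

cur.callproc("dbms_output.enable", [None])   # NULL buffer size = unlimited
cur.callproc("ut.run", ["demo_tests"])       # run one test package

# Drain DBMS_OUTPUT line by line to see the utPLSQL report.
line = cur.var(str)
status = cur.var(int)
while True:
    cur.callproc("dbms_output.get_line", [line, status])
    if status.getvalue() != 0:
        break
    print(line.getvalue())

conn.close()
```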
4:30 PM-5:30 PM - Introducing Oracle Database 19c - @SeanStaceyfwiw
Sean is an Oracle PM for the Database product and one of the few authoritative voices on the topic, so I was excited to dip my toes back into my Oracle roots with this session. 19c is a "long term release": 18c support appears to end around 2021, while 19c support runs somewhere beyond 2026, and the core aim of the release is stability, with thousands of bugs fixed and thousands of human-years spent on automated regression testing. There are a lot of new features worth looking up, such as Stats-Only Queries, SQL Quarantine, High-Frequency Automatic Statistics Collection, Real-Time Statistics Collection, Automatic Indexing, and Hybrid Partitioned Tables! How best to find out? Go to https://apex.oracle.com/database-features/, click on 19c, click on New Features Only, and then pick your new feature of choice to see the documentation. Oh, and want to know something neat? Try running this on a 19c database: select json_object(*) from customers and see what you get.
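And if you would rather poke at that last query from Python than from SQL*Plus, a minimal sketch is below; the connect string and the customers table are placeholders, and any table will do, since json_object(*) builds a document from whatever columns the row has.

```python
# Minimal sketch: run the 19c json_object(*) query through cx_Oracle.
# Credentials, DSN, and the "customers" table are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("scott", "tiger", "dbhost/ORCLPDB1")
cur = conn.cursor()
cur.execute("select json_object(*) from customers")
for (doc,) in cur:
    print(doc)   # each row comes back as a JSON document
conn.close()
```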