Poor Juan Ponce de Leon. The Spanish explorer spent years searching for the Fountain of Youth. Instead, all he discovered was Florida. Many in the database business have likewise spent years chasing a myth: a single database that can simultaneously support both high-performance, high-speed transactional processing and large-scale advanced analytics and data science. In both cases, the object of desire never existed in the first place. In fact, the story of Ponce de Leon's search for the mythical fountain is itself a myth. But the search for a magic database is all too real.
The truth is, trade-offs are required for any single database or system to support both transactions and analytics. Either transaction performance takes a hit or the amount of data available for analysis must be limited. For some use cases, these trade-offs are acceptable. But for use cases where both high-performance transactions and big data analytics are required, the better approach is to seamlessly connect two best-of-breed solutions, such as Pivotal GemFire, a Java-based transactional in-memory data grid, and Pivotal Greenplum, a massively parallel processing analytical database. In this episode of Pivotal Insights, host Jeff Kelly speaks with Pivotal's Ivan Novick and Jag Mirani about the new GemFire-Greenplum Connector, which supports use cases that require both high-performance transactions and big data analytics, such as fraud detection, risk management, and customer recommendations.
Show Notes
- Visit http://pivotal.io/podcasts for show notes and other episodes.
- Download the episode from SoundCloud, subscribe to the feed directly, or subscribe on iTunes to have new episodes downloaded automatically.
- Twitter: @jeffreyfkelly, @jagmirani
- Feedback: [email protected]