This short video explains why companies use Hazelcast for business-critical applications built on its ultra-fast in-memory and stream processing technologies.
Stream processing is a hot topic right now, especially for any organization looking to provide insights faster. But what does it mean for users of Java applications, microservices, and in-memory computing?
In this webinar, we will cover the evolution of stream processing and in-memory computing in relation to big data technologies, and explain why stream processing is the logical next step for in-memory processing projects.
Now, deploying Hazelcast-powered applications in a cloud-native way becomes even easier with the introduction of Hazelcast Cloud Enterprise, a fully managed service built on the Enterprise edition of Hazelcast IMDG. Can't attend the live session? You should still register! We'll send the recording to all registrants after the webinar.
Hazelcast is releasing a reference implementation that simplifies a financial services organization's ability to execute and scale financial risk calculations in the cloud, while gaining real-time performance and fully utilizing its resource-heavy investment.
Hazelcast announced the expansion of its channel program to include several new partners that increase the company’s go-to-market capabilities in verticals and regions experiencing strong cloud adoption.
Hazelcast Jet brings new SQL query capabilities to the stream processing platform that will enable developers to continuously query streaming data without interrupting data flow.
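To illustrate what a continuous query looks like, here is a minimal sketch in Jet's SQL dialect. The stream name (`trades`) and its columns are hypothetical placeholders, not part of the announcement; consult the Hazelcast Jet SQL documentation for the exact source-mapping syntax your version supports.

```sql
-- A continuous query over a hypothetical stream of trade events.
-- Unlike a batch query, this runs indefinitely: results are emitted
-- as matching rows arrive, without interrupting the data flow.
SELECT ticker, price, quantity
FROM trades
WHERE price > 100;
```

Because the query is evaluated incrementally over the live stream, clients consume an unbounded result set rather than a one-time snapshot.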
Hazelcast discusses how in-memory computing and AI are the key to up-selling at scale in the financial services sector.
Real-time streaming data has become a reality for an increasing number of enterprise applications.
San Mateo, Calif., October 7, 2020 – Hazelcast, the leading open source in-memory computing platform, today announced that IBM will offer the Hazelcast In-Memory Computing Platform (IMCP) integrated as part of IBM’s Cloud Pak offerings, enterprise-ready containerized software running on Red Hat OpenShift. IBM’s Cloud Paks, including the Cloud Pak for Multicloud Management, can now […]
In-memory computing platform maker Hazelcast is adding new features and enhancements to its namesake Java-based in-memory data grid.
In-memory data grids (IMDGs) have historically excelled in applications that require the fastest processing times and the lowest latencies. By adding a stream processing engine, called Jet, to its IMDG, Hazelcast is finding customers exploring new use cases at the cutting edge of high-performance computing.
As all online business becomes more competitive – within the same industry and in the broader realm of customer experience in web and mobile apps – low latency will continue to be a priority. So as a business, how can you ensure the highest performance from your systems when moving to the cloud?
In this podcast, John DesJardins, field CTO and VP of solution architecture at Hazelcast, sat down with InfoQ podcast co-host Daniel Bryant. Topics discussed included: how in-memory data grids have evolved, use cases at the edge (IoT, ML inference), integration of stream processing APIs and techniques, and how data grids can be used within application modernization.
Edge computing has enormous potential to transform how companies leverage data to produce value.
Hazelcast today announced a major new feature and a number of enhancements to its in-memory data grid (IMDG), Hazelcast IMDG.