Companies need a data-processing solution that accelerates business agility rather than one complicated by a long list of technology requirements. This calls for a system that delivers continuous, real-time data-processing capabilities for the new business reality.
Stream processing is a hot topic right now, especially for any organization looking to provide insights faster. But what does it mean for users of Java applications, microservices, and in-memory computing?
In this webinar, we will cover the evolution of stream processing and in-memory computing in relation to big data technologies, and why stream processing is the logical next step for in-memory processing projects.
Setting up servers and configuring software can get in the way of the problems you are trying to solve. With Hazelcast Cloud we take all of those pain points away.
Watch this webinar to learn how you can instantly fire up and then work with Hazelcast Cloud from anywhere in the world. With our auto-generated client stubs for Java, Go, Node.js, Python and .NET, we can have you connected and coding in less than a minute!
This performance benchmark compares Redis Open Source 5.0.3 vs Hazelcast IMDG Enterprise 3.12. The purpose is to reproduce the low thread count benchmark conducted by Redis Labs in 2018, correcting for deficiencies in the Redis benchmark.
The things we changed were:
Hazelcast IMDG Enterprise 3.12 was slightly faster than Redis.
We would also draw your attention to our previous benchmarks, which have consistently found Hazelcast to be faster than Redis:
3 members (1 member per machine)
<?xml version="1.0" encoding="UTF-8"?>
<port port-count="200" auto-increment="true">5701</port>
<!-- Add the member IP addresses here -->
<native-memory allocator-type="POOLED" enabled="true">
  <size unit="GIGABYTES" value="30"/>
</native-memory>
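Taken together, the fragments above would sit inside a single hazelcast.xml along the lines of the following sketch. The wrapping elements (the hazelcast root, network, and join sections, and the disabled multicast) are our reconstruction of a typical Hazelcast 3.x declarative configuration, not part of the published benchmark config, and the member IP addresses remain placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<hazelcast xmlns="http://www.hazelcast.com/schema/config">
    <network>
        <port port-count="200" auto-increment="true">5701</port>
        <join>
            <multicast enabled="false"/>
            <tcp-ip enabled="true">
                <!-- Add the member IP addresses here -->
            </tcp-ip>
        </join>
    </network>
    <native-memory allocator-type="POOLED" enabled="true">
        <size unit="GIGABYTES" value="30"/>
    </native-memory>
</hazelcast>
```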
No additional configuration in the redis.conf file. The system was adjusted per the suggestions in the Redis process output:
Hazelcast Java client
See the comments on Redis clients at the top of this page.
RadarGun 3.0.0 with modifications to support Redis.
Number of threads per client
Number of objects
Command line used
dist.sh -c <benchmark_xml_above> -t -m <master_ip>:2103 server1 server2 server3 client1 client2 client3
In order to provide an apples-to-apples comparison, we needed to run the tests for both products using the same tool, RadarGun. However, RadarGun did not have the capabilities for the setup presented in Redis Labs' benchmark.
We implemented support for all of the missing pieces, namely:
The source code of the modifications can be found in the following GitHub repository: https://github.com/hazelcast/radargun/tree/redis_benchmark. We hope to push these changes to the official RadarGun repository.
From the charts, we see that Redis with the Lettuce client performed ~612,000 operations per second, whereas Hazelcast IMDG Enterprise performed ~636,000 operations per second.
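As a quick sanity check on those figures, the relative gap works out to roughly four percent. A minimal arithmetic sketch, using the approximate throughput readings from the charts above:

```python
# Approximate throughput readings taken from the benchmark charts.
redis_ops = 612_000      # Redis 5.0.3 with the Lettuce client, ops/sec
hazelcast_ops = 636_000  # Hazelcast IMDG Enterprise 3.12, ops/sec

# Relative advantage of Hazelcast over Redis.
speedup = (hazelcast_ops - redis_ops) / redis_ops
print(f"Hazelcast is ~{speedup:.1%} faster")  # ~3.9%
```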