Companies need a data-processing solution that accelerates business agility rather than one weighed down by too many technology requirements. That calls for a system that delivers continuous, real-time data-processing capabilities for the new business reality.
Stream processing is a hot topic right now, especially for any organization looking to provide insights faster. But what does it mean for users of Java applications, microservices, and in-memory computing?
In this webinar, we will cover how stream processing and in-memory computing have evolved alongside big data technologies, and why stream processing is the logical next step for in-memory processing projects.
Setting up servers and configuring software can get in the way of the problems you are trying to solve. With Hazelcast Cloud we take all of those pain points away.
Watch this webinar to learn how you can instantly fire up and then work with Hazelcast Cloud from anywhere in the world. With our auto-generated client stubs for Java, Go, Node.js, Python and .NET, we can have you connected and coding in less than a minute!
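As a taste of what connecting looks like, here is a minimal sketch of a Hazelcast Java client reaching a Hazelcast Cloud cluster. The cluster name and discovery token are placeholders (your cloud console shows the real values), and the API shown assumes a recent Hazelcast Java client:

```java
import com.hazelcast.client.HazelcastClient;
import com.hazelcast.client.config.ClientConfig;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class CloudClientSketch {
    public static void main(String[] args) {
        // Placeholder credentials -- copy the real ones from your cluster's console.
        ClientConfig config = new ClientConfig();
        config.setClusterName("YOUR_CLUSTER_NAME");
        config.getNetworkConfig().getCloudConfig().setEnabled(true);
        config.getNetworkConfig().getCloudConfig().setDiscoveryToken("YOUR_DISCOVERY_TOKEN");

        // Connects to the cloud cluster using the discovery token above.
        HazelcastInstance client = HazelcastClient.newHazelcastClient(config);

        // A distributed map lives in the cluster's memory, not in this JVM.
        IMap<String, String> map = client.getMap("greetings");
        map.put("hello", "world");
        System.out.println(map.get("hello"));

        client.shutdown();
    }
}
```

The same pattern applies in the other client languages: supply the cluster name and token, obtain a client instance, and work with distributed data structures by name.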
Irish Revenue is the Irish tax and customs administration, employing approximately 5,968 people. It deals with almost 3.5 million personal and business taxpayers and collects in the order of €48 billion a year in taxes and duties. It is also responsible for trade facilitation and frontier control. The organization is highly decentralized, with offices in all parts of the country.
Following the introduction of a new property tax, the Irish Revenue’s IT department was tasked with launching a new service for homeowners, who were required to declare their property liability online. It soon became apparent that the existing IT architecture wouldn’t be able to cope with two million property owners accessing the website – potentially at the last minute.
In addition, Irish Revenue wanted to change the way it managed back-ups, fixes, and upgrades. Historically, these were conducted during the evenings, which could leave some services unavailable while overnight copies were made and applications updated. It therefore required a solution that could perform operational tasks seamlessly in parallel, with no external effect on service quality.
Crucially, Irish Revenue had two prime solution requirements: high availability and performance. Looking for an open-source solution that could handle surges in traffic and store data in-memory, Irish Revenue approached Hazelcast.
Since the implementation of Hazelcast IMDG, development and testing effort has been reduced while quality has improved. Any concerns raised by Irish Revenue have been taken on board by Hazelcast, with enhancements included in the next product release. Deploying Hazelcast IMDG has provided greater availability, which in turn has changed the way Irish Revenue manages new releases, fixes, and enhancements.
Application changes, previously done at the weekend, can now be conducted seamlessly during the week with no impact on service performance. Before Hazelcast was deployed, the application architecture relied on a single application server, so an issue with one application could result in downtime for all of them. Now, Irish Revenue is able to isolate applications and conduct application changes on an individual basis.
Using Hazelcast IMDG has enabled Irish Revenue to be more agile and efficient, empowering developers to introduce new customer innovations in the knowledge that there will be no impact on service quality. “With the introduction of the new architecture and the deployment of Hazelcast, testing and development effort has been reduced and overall quality has improved. As we are now storing data in-memory, we are using more hardware resource, but the benefits are clear to see. Hardware costs are cheaper than labor costs so, in many respects, it’s about finding the right balance.”
Whether you're interested in learning the basics of in-memory systems, or you're looking for advanced, real-world production examples and best practices, we've got you covered.