How to handle processing of a large set of data?

Roopak A N
Jan 15, 2018

My current problem involves roughly 120 million objects in total. Each operation works on around 30-40 million of them, selected from this superset by some filters. Our stack is Java, MySQL, and Hibernate. While processing, we need all of these objects in memory.
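
For context on the fetch itself: loading a subset this large through a regular Hibernate `Session` would also fill the first-level cache with tens of millions of tracked entities. One common pattern is to stream the filtered rows with a `StatelessSession` and `ScrollableResults` instead. Below is a minimal sketch; the `Event` entity and `region` filter field are hypothetical placeholders, not part of our actual model:

```java
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.SessionFactory;
import org.hibernate.StatelessSession;
import org.hibernate.query.Query;

import javax.persistence.Entity;
import javax.persistence.Id;

import java.util.ArrayList;
import java.util.List;

// Hypothetical mapped entity standing in for the real domain object.
@Entity
class Event {
    @Id long id;
    String region;
}

public class FilteredBulkLoader {

    /**
     * Streams the filtered subset from MySQL in chunks. A StatelessSession
     * skips Hibernate's first-level cache, so the session itself does not
     * accumulate millions of tracked entities during the fetch.
     */
    public static List<Event> loadSubset(SessionFactory sessionFactory, String region) {
        List<Event> subset = new ArrayList<>();
        StatelessSession session = sessionFactory.openStatelessSession();
        try {
            Query<Event> query = session.createQuery(
                    "from Event e where e.region = :region", Event.class);
            query.setParameter("region", region);
            // Chunked fetch. Note: MySQL Connector/J buffers the whole result
            // set by default; the fetch size only streams if the JDBC URL sets
            // useCursorFetch=true (an assumption about your driver config).
            query.setFetchSize(10_000);
            ScrollableResults results = query.scroll(ScrollMode.FORWARD_ONLY);
            try {
                while (results.next()) {
                    subset.add((Event) results.get(0));
                }
            } finally {
                results.close();
            }
        } finally {
            session.close();
        }
        return subset;
    }
}
```

This keeps the database and driver memory flat during the fetch; the remaining question is how to hold the 30-40 million loaded objects efficiently.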

Please share how you would handle such a problem. Thanks in advance for the help.


Edit: Our solution is time-critical, which is why we handle all operations in memory.
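
Since everything has to stay resident and access time matters, one footprint-reducing pattern worth mentioning is a column-oriented layout: one primitive array per field instead of ~40 million full entity objects, avoiding per-object headers, references, and boxed fields. A rough sketch, with hypothetical fields (`quantity`, `price`) that are not from our actual model:

```java
/**
 * Column-oriented store: one primitive array per field instead of
 * millions of heap objects. Sequential scans over primitives are
 * cache-friendly and avoid per-object pointer chasing.
 * Field names here are illustrative placeholders.
 */
public final class ColumnStore {
    private final long[] ids;
    private final int[] quantities;
    private final double[] prices;
    private int size;

    public ColumnStore(int capacity) {
        ids = new long[capacity];
        quantities = new int[capacity];
        prices = new double[capacity];
    }

    public void add(long id, int quantity, double price) {
        ids[size] = id;
        quantities[size] = quantity;
        prices[size] = price;
        size++;
    }

    // Example operation over the whole resident set.
    public double totalValue() {
        double total = 0;
        for (int i = 0; i < size; i++) {
            total += quantities[i] * prices[i];
        }
        return total;
    }

    public int size() {
        return size;
    }
}
```

At 40 million rows this layout costs roughly 40M x (8 + 4 + 8) bytes, about 800 MB, versus several gigabytes for full entities, and it can be populated directly from a streaming fetch loop instead of materializing entity objects first.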
