Batch processing

When you interact with the session by saving entities or fetching them from the database, Hibernate keeps them in the persistent context until the session is closed, or until you evict them or clear the session. This is Hibernate's first-level cache.

Care must be taken when executing queries that load many objects or when trying to save a large set of entities. If you don't perform some cleanup work, your JVM will run out of memory in the middle of the work unit.

There are several things you can do to avoid such situations. Some involve manual work, while others are handled by Hibernate, provided you give it enough hints or use the right session type.

Note

How does Hibernate know whether it should call JDBC executeBatch? This decision is made in the entity persister, which is responsible for persisting an entity via JDBC. Hibernate keeps track of all the DML statements for each entity type, and when the statement count is more than 1 for a particular entity type, it will use batch execution.

Manual batch management

One way to ensure that batch processing is under control is by manually managing the population of the first-level cache, that is, the persistent context. If you are saving or updating a batch of entities, you can occasionally flush and clear the session. Flushing the session will execute all the pending SQL statements, and when you clear the session, all the entities are evicted from the persistent context.

You can do this by forcing a flush and clear, as follows:

  public void saveStudents(List<Map<String, String>> students) {
    final int batchSize = 15;
    Session session = HibernateUtil
        .getSessionFactory()
        .getCurrentSession();
    Transaction transaction = session.beginTransaction();
    try {
      int i = 0;
      for (Map<String, String> studentMap : students) {
        i++;
        Student student = new Student();
        student.setFirstname(studentMap.get("firstname"));
        student.setLastname(studentMap.get("lastname"));
        session.save(student);

        if (i % batchSize == 0) {
          // execute the pending inserts, then evict the saved
          // entities from the persistent context
          session.flush();
          session.clear();
        }
      }
      transaction.commit();
    }
    catch (Exception e) {
      transaction.rollback();
      // log stack trace
    }
    finally {
      if (session.isOpen())
        session.close();
    }
  }

You should use the same mechanism when you are fetching entities. There is a slight performance hit when you flush and clear the session, but it is not significant. Your JDBC connection is still open and the transaction is still active; these are the expensive resources whose lifecycle you need to be concerned with in your design. (Refer to the earlier discussion on contextual sessions.)
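
As a sketch of the same idea on the read side, the following scrolls through a large result set and periodically clears the persistent context. It assumes the same Student entity and HibernateUtil helper as the previous example; the query and batch size are only illustrative.

  public void processStudents() {
    final int batchSize = 15;
    Session session = HibernateUtil
        .getSessionFactory()
        .getCurrentSession();
    Transaction transaction = session.beginTransaction();

    // scroll through the results instead of loading them all at once
    ScrollableResults students = session
        .createQuery("from Student")
        .scroll(ScrollMode.FORWARD_ONLY);

    int i = 0;
    while (students.next()) {
      Student student = (Student) students.get(0);
      // ... process the student ...

      if (++i % batchSize == 0) {
        // flush first if processing modified any entities,
        // then evict everything from the persistent context
        session.flush();
        session.clear();
      }
    }
    transaction.commit();
  }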

Setting batch size

When you flush the session, you are essentially submitting the appropriate SQL statements to the JDBC layer. In the JDBC world, you can either execute a single statement, or you can batch statements and, when ready, execute the batch (refer to the java.sql.Statement.addBatch(…) and executeBatch() methods).
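
To make the JDBC side of this concrete, here is a minimal sketch of batching with a PreparedStatement; the connection, table, and column names are illustrative and not part of the book's example.

  public void insertStudents(Connection connection,
      List<String[]> names) throws SQLException {
    PreparedStatement statement = connection.prepareStatement(
        "insert into STUDENT (FIRSTNAME, LASTNAME) values (?, ?)");
    try {
      for (String[] name : names) {
        statement.setString(1, name[0]);
        statement.setString(2, name[1]);
        // queue the statement instead of executing it right away
        statement.addBatch();
      }
      // send all queued statements to the database in one batch
      statement.executeBatch();
    }
    finally {
      statement.close();
    }
  }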

There is no notion of a batch size in JDBC itself, but Hibernate uses the hibernate.jdbc.batch_size property to control how many entities go into a batch. Setting this property doesn't mean you no longer have to worry about memory exhaustion; you still have to manually manage the persistent context for a large batch. It simply determines how many times Hibernate calls addBatch(…) before calling executeBatch() once it decides that DML statements can be batched.
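
As a sketch, the property can be set when you build the session factory (it can equally go into hibernate.cfg.xml or hibernate.properties); the value of 15 here is only an example.

  // ask Hibernate to call addBatch(...) up to 15 times
  // before calling executeBatch()
  SessionFactory sessionFactory = new Configuration()
      .configure()
      .setProperty("hibernate.jdbc.batch_size", "15")
      .buildSessionFactory();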

There is another batch size setting, which comes in the form of an annotation: @BatchSize, used to decorate an entity class or a collection. This setting is not for batch inserts or updates; it is used at fetch time, for collections and entities that are loaded lazily.
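
For illustration, here is a sketch of @BatchSize on a lazily loaded collection; the Course association is hypothetical and not part of the Student mapping used earlier.

  @Entity
  public class Student {
    @Id
    @GeneratedValue
    private long id;

    // when one student's courses are initialized, Hibernate fetches
    // the uninitialized course collections of up to 25 students
    // in a single query instead of one query per student
    @OneToMany(mappedBy = "student", fetch = FetchType.LAZY)
    @BatchSize(size = 25)
    private Set<Course> courses;

    // getters and setters omitted
  }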

Using stateless session

The stateless session was introduced earlier. As there is no persistent context in a stateless session, you don't need to flush or clear it. All changes to your entities are reflected immediately in the database, as there is no delayed write. Remember that there are no cascade operations on associated entities, and associated collections are ignored, so you have to manage all the entities manually.
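
Here is a brief sketch of the earlier save loop rewritten with a stateless session, again assuming the Student entity and HibernateUtil helper from the previous example.

  public void saveStudentsStateless(List<Map<String, String>> students) {
    StatelessSession session = HibernateUtil
        .getSessionFactory()
        .openStatelessSession();
    Transaction transaction = session.beginTransaction();
    try {
      for (Map<String, String> studentMap : students) {
        Student student = new Student();
        student.setFirstname(studentMap.get("firstname"));
        student.setLastname(studentMap.get("lastname"));
        // insert() issues the SQL immediately; there is no persistent
        // context to flush or clear, and no cascading to associations
        session.insert(student);
      }
      transaction.commit();
    }
    catch (Exception e) {
      transaction.rollback();
      // log stack trace
    }
    finally {
      session.close();
    }
  }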
