Unique constraint with saveAll from JpaRepository
My application receives arrays containing objects, which are mapped to entities and then persisted.
The object has an id property that is mapped to a column with a unique constraint placed on it. Objects with duplicate ids may be sent to the application.
I'm using the saveAll method, but it throws during insertion if any of the objects happens to violate the unique constraint.
Is there an easy way to make it insert all the non-duplicates and ignore the duplicates, or simply update them?
I've tried overriding hashCode and equals, but that didn't help.
This application receives dozens of requests per second, and the number of duplicates I expect to receive is low.
entity:
@Entity
@Table(name = "table", uniqueConstraints = @UniqueConstraint(columnNames = "natural_id"))
public class Entity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(name = "natural_id", unique = true)
    private String naturalId;

    // ...
}
repository:
public interface Repository extends JpaRepository<Entity, Long>, JpaSpecificationExecutor<Entity> {}
saving method:
void save(List<Entity> entities) {
    try {
        repository.saveAll(entities);
    } catch (Exception e) {
        log.error("...");
        throw e;
    }
}
If I were to change the saving method to:
@Transactional
void save(List<Entity> entities) {
    for (Entity et : entities) {
        try {
            repository.save(et);
        } catch (Exception e) {
            log.error("...");
        }
    }
}
Would that make it all happen within a single transaction? I'm worried because the save method normally creates a new transaction each time it is called, and that would make things extremely slow for an application that receives a decent number of requests per second.
I might ultimately have to query the database before insertion to filter the duplicates out for insertion, and do that again every time a constraint violation exception is caught. It should work, but is quite ugly.
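The pre-filtering idea above can be sketched with plain collections. This is a minimal sketch, not the application's actual code: `filterNew` and the id lists are hypothetical, and in the real application `existingIds` would come from a repository query (for example a derived `findByNaturalIdIn` lookup) rather than a hard-coded set.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class DuplicateFilter {

    // Drops ids already present in the database (existingIds) and removes
    // in-batch repeats, keeping the first occurrence in input order.
    static List<String> filterNew(List<String> incomingIds, Set<String> existingIds) {
        Set<String> seen = new LinkedHashSet<>();
        for (String id : incomingIds) {
            if (!existingIds.contains(id)) {
                seen.add(id); // LinkedHashSet silently ignores in-batch repeats
            }
        }
        return new ArrayList<>(seen);
    }

    public static void main(String[] args) {
        List<String> incoming = List.of("a", "b", "a", "c");
        Set<String> existing = Set.of("b"); // pretend "b" is already persisted
        System.out.println(filterNew(incoming, existing)); // [a, c]
    }
}
```

Note the caveat from the question still applies: between the lookup and the insert, a concurrent request can insert the same natural id, so the constraint-violation handling is still needed as a fallback.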
Solution 1:[1]
You have to save the records individually: saveAll cannot resume or recover when one record fails, because all the records share a single transaction, so the remaining records are not saved either.
Rather than querying the database before insertion to filter out the duplicates, try the for loop with try/catch that you already sketched. Since you mention "dozens of requests per second", the per-record error handling should not add much overhead.
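An alternative not mentioned in this answer: if the database supports it (PostgreSQL does), the insert-and-ignore behavior can be pushed into a native query so no exception is raised at all. This is a hedged sketch, not code from the question: the method name insertIgnoring is made up, and "table" / natural_id are the placeholder names from the entity above.

```java
public interface Repository extends JpaRepository<Entity, Long> {

    // Native PostgreSQL insert that silently skips rows whose natural_id
    // already exists, instead of throwing a constraint-violation exception.
    @Modifying
    @Transactional
    @Query(value = "INSERT INTO \"table\" (natural_id) VALUES (:naturalId) "
                 + "ON CONFLICT (natural_id) DO NOTHING", nativeQuery = true)
    void insertIgnoring(@Param("naturalId") String naturalId);
}
```

The trade-off is that this bypasses the JPA persistence context (no generated ids are returned to managed entities), so it fits best when the rows are fire-and-forget inserts.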
Solution 2:[2]
If you have already implemented hashCode and equals, then use a Set instead of a List to collect your items before saving them, so in-batch duplicates are dropped.
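A minimal sketch of this idea, where Item is a simplified stand-in for the JPA entity (not the original class) with equality based on the natural id, as the answer assumes:

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Objects;
import java.util.Set;

public class SetDedupExample {

    // Simplified stand-in for the entity; equals/hashCode use the natural id.
    static class Item {
        final String naturalId;

        Item(String naturalId) { this.naturalId = naturalId; }

        @Override
        public boolean equals(Object o) {
            return o instanceof Item && ((Item) o).naturalId.equals(naturalId);
        }

        @Override
        public int hashCode() { return Objects.hash(naturalId); }
    }

    // Collecting into a LinkedHashSet removes in-batch duplicates while
    // preserving insertion order; saveAll accepts any Iterable, so the
    // resulting Set can be passed to it directly.
    static Set<Item> dedupe(List<Item> items) {
        return new LinkedHashSet<>(items);
    }

    public static void main(String[] args) {
        List<Item> batch = List.of(new Item("x"), new Item("y"), new Item("x"));
        System.out.println(dedupe(batch).size()); // 2
    }
}
```

Note this only removes duplicates within a single batch; it does not protect against natural ids that already exist in the database, so the unique constraint can still be violated by a later request.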
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ayush |
| Solution 2 | Svetlana |
