
How to make sure changes in the database that happen within a transaction are also seen by everything within that transaction

I've got a transaction that saves or updates a set of objects in a database. This is the code:

@Transactional
public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingEntry != null) {
            log.debug("Remove previous version of PDBEntry {}", existingEntry);
            makeTransient(existingEntry);
        }
        makePersistent(pdbEntry);
    }
}

And in the genericDAO:

public void makePersistent(I entity) {
    getCurrentSession().saveOrUpdate(entity);
}

public void makeTransient(I entity) {
    getCurrentSession().delete(entity);
}

Somehow this doesn't work: it says it can't insert the object because of a duplicate key, even though I can see in the logs that it gets to makeTransient(). I guess this has to do with the fact that all of this is happening within a transaction, so the changes made by makeTransient() might not be seen by the makePersistent() method. I could solve this by copying all the data from pdbEntry into existingEntry and then doing saveOrUpdate(existingEntry), but that is sort of a dirty hack. Is there another way to make sure the makeTransient() is visible to makePersistent(), while still keeping it all within a transaction?

EDIT: This is my PDBEntry domain model:

@Entity
@Data
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@EqualsAndHashCode(callSuper = false, of = { "accessionCode", "date" })
@SuppressWarnings("PMD.UnusedPrivateField")
public class PDBEntry extends DomainObject implements Serializable {
    @NaturalId
    @NotEmpty
    @Length(max = 4)
    private String          accessionCode;

    @NaturalId
    @NotNull
    @Temporal(TemporalType.DATE)
    private Date            date;

    private String          header;

    private Boolean         isValidDssp;

    @Temporal(TemporalType.TIMESTAMP)
    private Date            lastUpdated     = new Date(System.currentTimeMillis());

    @OneToOne(mappedBy = "pdbEntry", cascade = CascadeType.ALL)
    private ExpMethod       expMethod;

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<Refinement> refinementSet   = new HashSet<Refinement>();

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<HetGroup>   hetGroupSet     = new HashSet<HetGroup>();

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<Chain>      chainSet        = new HashSet<Chain>();

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<Chain>      residueSet      = new HashSet<Chain>();

    public Date getLastUpdated() {
        return new Date(lastUpdated.getTime());
    }

    public void setLastUpdated() throws InvocationTargetException {
        throw new InvocationTargetException(new Throwable());
    }

    public void touch() {
        lastUpdated = new Date(System.currentTimeMillis());
    }

    @Override
    public String toString() {
        return accessionCode;
    }

    public PDBEntry(String accessionCode, Date date) throws NullPointerException {
        if (accessionCode != null && date != null) {
            this.accessionCode = accessionCode;
            this.date = date;
        } else {
            throw new NullPointerException();
        }
    }
}

@MappedSuperclass
public abstract class DomainObject implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long    id;

    public Long getId() {
        return id;
    }

    @Override
    public abstract boolean equals(Object obj);

    @Override
    public abstract int hashCode();

    @Override
    public abstract String toString();
}

EDIT: New problem. I've created a method that first deletes all the existing objects from the database, like so:

@Override
@Transactional
public void updatePDBEntries(Set<PDBEntry> pdbEntrySet) {
    findAndRemoveExistingPDBEntries(pdbEntrySet);
    savePDBEntries(pdbEntrySet);
}

@Override
@Transactional
public void findAndRemoveExistingPDBEntries(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingPDBEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingPDBEntry != null) {
            log.info("Delete: {}", pdbEntry);
            makeTransient(existingPDBEntry);
        }
    }
}

@Override
@Transactional
public void savePDBEntries(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        log.info("Save: {}", pdbEntry);
        makePersistent(pdbEntry);
    }
}

It seems to delete the first 73 entries it encounters, but then gives an error:

WARN  2010-10-25 14:28:49,406 main JDBCExceptionReporter:100 - SQL Error: 0, SQLState: 23503
ERROR 2010-10-25 14:28:49,406 main JDBCExceptionReporter:101 - Batch entry 0 /* delete nl.ru.cmbi.pdbeter.core.model.domain.PDBEntry */ delete from PDBEntry where id='74' was aborted.  Call getNextException to see the cause.
WARN  2010-10-25 14:28:49,406 main JDBCExceptionReporter:100 - SQL Error: 0, SQLState: 23503
ERROR 2010-10-25 14:28:49,406 main JDBCExceptionReporter:101 - ERROR: update or delete on table "pdbentry" violates foreign key constraint "fke03a2dc84d44e296" on table "hetgroup"
  Detail: Key (id)=(74) is still referenced from table "hetgroup".
ERROR 2010-10-25 14:28:49,408 main AbstractFlushingEventListener:324 - Could not synchronize database state with session
org.hibernate.exception.ConstraintViolationException: Could not execute JDBC batch update

Any ideas how this error might arise?

EDIT: I found the problem. The last error was caused by the HetGroup model, which is as follows:

@Entity
@Data
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@EqualsAndHashCode(callSuper = false, of = { "pdbEntry", "hetId" })
@SuppressWarnings("PMD.UnusedPrivateField")
// extends DomainObject which contains Id, NaturalId is not enough in this case, since duplicate chains still exist
// in fact this is an error of the PDBFinder and will be fixed in the future
public class HetGroup extends DomainObject implements Serializable {
    //@NaturalId 
    @NotNull
    @ManyToOne
    private PDBEntry    pdbEntry;

    //@NaturalId 
    @NotEmpty
    private String hetId;

    private Integer nAtom;

    @Length(max = 8192)
    private String name;

    public HetGroup(PDBEntry pdbEntry, String hetId) {
        this.pdbEntry = pdbEntry;
        pdbEntry.getHetGroupSet().add(this);

        this.hetId = hetId;
    }
}

Pay particular attention to the commented-out @NaturalIds: I commented them out because there was still an error in my data that caused duplicate het groups, so I thought I would just drop the UNIQUE constraint on them for now. But I forgot that I was using Lombok to create the equals and hashCode methods for me, as can be seen from the line:

@EqualsAndHashCode(callSuper = false, of = { "pdbEntry", "hetId" })

This caused the error with the duplicate hetIds. I fixed the data and reinstated the @NaturalIds, and now everything works fine.
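For reference, these are the relevant HetGroup fields with the annotations reinstated (the rest of the class is unchanged):

@NaturalId
@NotNull
@ManyToOne
private PDBEntry pdbEntry;

@NaturalId
@NotEmpty
private String hetId;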

Thanks all!


Somehow this doesn't work: it says it can't insert the object because of a duplicate key, even though I can see in the logs that it gets to makeTransient().

To understand what is happening here, you first need to understand that Hibernate doesn't immediately write changes to the database: changes are enqueued in the session and written at flush time. So even if you see that makeTransient() is called, this doesn't mean the corresponding record has actually been deleted from the database at the time the method is called. The SQL DELETE statement and the other pending changes are executed when the flush occurs (either by calling flush() explicitly, at commit() time, or before some query executions). This is well explained in the documentation:

10.10. Flushing the Session

Sometimes the Session will execute the SQL statements needed to synchronize the JDBC connection's state with the state of objects held in memory. This process, called flush, occurs by default at the following points:

  • before some query executions
  • from org.hibernate.Transaction.commit()
  • from Session.flush()

The SQL statements are issued in the following order:

  1. all entity insertions in the same order the corresponding objects were saved using session.save()
  2. all entity updates
  3. all collection deletions
  4. all collection element deletions, updates and insertions
  5. all collection insertions
  6. all entity deletions in the same order the corresponding objects were deleted using Session.delete()

An exception is that objects using native ID generation are inserted when they are saved.

...

So, let's look again at your code:

01: @Transactional
02: public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
03:     for (PDBEntry pdbEntry : pdbEntrySet) {
04:         PDBEntry existingEntry = findByAccessionCode(pdbEntry.getAccessionCode());
05:         if (existingEntry != null) {
06:             log.debug("Remove previous version of PDBEntry {}", existingEntry);
07:             makeTransient(existingEntry);
08:         }
09:         makePersistent(pdbEntry);
10:     }
11: }
  • at 04, you perform a query (this will flush pending changes if any)
  • at 07, you perform a session.delete() (in memory)
  • at 09, you perform a session.save() (in memory)
  • back at 04 (on the next loop iteration), the query triggers a flush of the pending changes, and Hibernate issues the SQL statements in the order specified above
    • and the INSERT fails with a duplicate key violation, because the old record hasn't been deleted yet

In other words, your problem has nothing to do with transactions; it's just related to the way Hibernate works and the way you use it.
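Distilled to its essence, the failure mode looks like this (a sketch, not your actual code; the accession code "1ABC", the newEntry variable and the sessionFactory are illustrative, and with IDENTITY ids the INSERT is even issued at save() time, per the exception noted in the quote above):

Session session = sessionFactory.getCurrentSession();

PDBEntry existing = findByAccessionCode("1ABC");
session.delete(existing); // the DELETE is only queued in the session
session.save(newEntry);   // same accessionCode as the existing row
session.flush();          // entity insertions are issued before entity
                          // deletions, so the INSERT hits the still-present
                          // row -> duplicate key violation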

Is there another way to make sure the makeTransient() is visible to makePersistent(), while still keeping it all within a transaction?

Well, without changing the logic, you could help Hibernate by calling flush() explicitly after the delete().
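For example (a sketch; it assumes the method has access to the current Session, e.g. through the generic DAO's getCurrentSession() shown above):

@Transactional
public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingEntry != null) {
            makeTransient(existingEntry);
            // push the pending DELETE to the database right away, so the
            // following save no longer collides with the old row
            getCurrentSession().flush();
        }
        makePersistent(pdbEntry);
    }
}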

But IMO, the whole approach is suboptimal, and you should update the existing record, if any, instead of deleting and re-inserting it (unless there are good reasons to do so).

References

  • Hibernate Core Reference Guide
    • 10.10. Flushing the Session


The single transaction should not be the problem.

Your suggestion of copying the data from pdbEntry into existingEntry is a far better solution. For one, it is a lot less database-intensive, and it is also a little easier to read and understand what is going on.

Also, if you did it this way, you would not be required to call saveOrUpdate() on the changed object. Hibernate implements what is known as "transparent persistence", which means that Hibernate is responsible for working out which data operations are needed to synchronize the object with the database. The update() operation in Hibernate is for reattaching detached objects; it is not needed for objects that are already persistent in the session.

In this case the code would look like this (note: no explicit update call is required):

public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingEntry != null) {
            // copy the relevant fields from pdbEntry to existingEntry,
            // preferably with a method on PDBEntry (see the sketch below)
        } else {
            // better to just call save() (not saveOrUpdate()) here,
            // because you know it's a new object
            makePersistent(pdbEntry);
        }
    }
}
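The copying method mentioned in the comment could look something like this (a sketch; the name copyFrom and the choice of fields are illustrative, based on the PDBEntry model posted above):

// On PDBEntry: copy the mutable state from another (unsaved) entry.
// The natural id (accessionCode, date) is deliberately left untouched.
public void copyFrom(PDBEntry other) {
    this.header = other.header;
    this.isValidDssp = other.isValidDssp;
    touch(); // refresh lastUpdated
    // expMethod, refinementSet, hetGroupSet, chainSet and residueSet
    // would need dedicated merge logic rather than a plain reference copy
}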
