IUserType Corresponding to Int64 With ODP.NET

We have a custom type, used for our system identifiers, that mostly corresponds to System.Int64 (it adds a few extra constraints on what the underlying long value can be).

We've also got an IUserType defined to allow us to map properties of this type with NHibernate, but when running against an Oracle database we run into what seems to be known as the "Int/Decimal Problem".
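For anyone unfamiliar with it, here's a minimal illustration (a hypothetical snippet, not from our code) of what goes wrong: ODP.NET hands Oracle NUMBER values back boxed as System.Decimal, and unboxing that as a long throws, while Convert.ToInt64 succeeds:

object raw = 42m;                     // what the data reader hands back on Oracle
// long bad = (long)raw;              // throws InvalidCastException: a boxed
//                                    // decimal cannot be unboxed as long
long good = Convert.ToInt64(raw);     // fine: converts instead of unboxing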

The relevant parts of the user type (get / set) are here:

public object NullSafeGet(IDataReader rs, string[] names, object owner) {
    // Wrap in Convert.ToInt64 in case the provider hands the value back
    // as something other than a long (e.g. decimal from ODP.NET).
    var longValue = Convert.ToInt64(NHibernateUtil.Int64.NullSafeGet(rs, names[0]));
    return new Identifier(longValue);
}

public void NullSafeSet(IDbCommand cmd, object value, int index) {
    // Unsaved identifiers are written as NULL so the generator can assign one.
    if (value == null || ((Identifier)value).IsNew) {
        NHibernateUtil.Int64.NullSafeSet(cmd, null, index);
        return;
    }
    NHibernateUtil.Int64.NullSafeSet(cmd, ((Identifier)value).Value, index);
}

We're currently getting an InvalidCastException from the ISession's SaveOrUpdate method when trying to save anything that contains an identifier property.

Is there a good (or even an alright) way to handle this so that it works for both Oracle and SQL Server, or will we need two different IUserType implementations?
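One idea we've considered (an untested sketch; Identifier is our custom type from above) is to skip NHibernateUtil in the getter and read the raw column value, letting Convert handle whatever type the provider returns:

public object NullSafeGet(IDataReader rs, string[] names, object owner) {
    int index = rs.GetOrdinal(names[0]);
    if (rs.IsDBNull(index))
        return null;

    // GetValue returns long on SQL Server and decimal on Oracle;
    // Convert.ToInt64 handles both without an unboxing cast.
    return new Identifier(Convert.ToInt64(rs.GetValue(index)));
}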


After some more investigation, it appears the problem may be with the sequence-identity generation strategy. To make sequence-based generation behave the way identity does on SQL Server (insert and ID retrieval in a single round trip), it issues a query like this:

INSERT INTO my_entity (id, name)
VALUES (hibernate_sequence.nextval, :p0)
RETURNING id INTO :nhIdOutParam

I'm guessing the out param (:nhIdOutParam) comes back as a decimal, and that is where the cast fails.

Switching to the "native" generation strategy works, but I'd still like to hear any other insights on this.
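For anyone else who hits this, the switch is just the generator element in the mapping. A minimal hbm.xml sketch (the entity, property, and user-type names are placeholders, not our real mapping):

<id name="Id" type="MyApp.IdentifierUserType, MyApp">
    <generator class="native">
        <!-- only consulted on dialects that pick sequences, e.g. Oracle -->
        <param name="sequence">hibernate_sequence</param>
    </generator>
</id>

With "native", NHibernate picks identity on SQL Server and a plain sequence (with a separate nextval round trip) on Oracle, which avoids the RETURNING out parameter entirely.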
