
DataNucleus exception when adding a new column to an HBase table

I am using DataNucleus with HBase. I have a 'user' table that contained 4 rows, and I have now added a new column to it. Every time I access an old 'user' object that does not have this column, DataNucleus throws an exception while trying to map the column to the property in the POJO. Is there no other way than updating the old 'user' objects with dummy data? My object mapping looks something like this:

@Persistent(columns={@Column(name="next_mail_timestamp", insertValue="#NULL", defaultValue="#NULL", allowsNull="true")},
            name="nextMailTimestamp", cacheable="false", nullValue=NullValue.DEFAULT)
private long nextMailTimestamp;

As you can see, I have tried using insertValue, defaultValue, allowsNull, and nullValue, but nothing seems to work.
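For reference, the only way I can think of to backfill the old rows is to write the missing cells directly with the HBase client API (going through JDO would hit the same exception while fetching the old objects). This is just a sketch; the table name, the column family and the second qualifier are guesses based on my mapping:

// Backfill sketch using the plain HBase client (bypassing DataNucleus).
// Assumptions: the class maps to table "User" with column family "User";
// "next_mail_timestamp" comes from the @Column above, the other qualifier
// name is guessed from the property name.
Configuration conf = HBaseConfiguration.create();
HTable table = new HTable(conf, "User");
byte[] family = Bytes.toBytes("User");
byte[] nextMail = Bytes.toBytes("next_mail_timestamp");
byte[] mailInterval = Bytes.toBytes("mail_interval_in_milliseconds"); // assumed qualifier
ResultScanner scanner = table.getScanner(new Scan());
try {
    for (Result row : scanner) {
        Put put = new Put(row.getRow());
        boolean dirty = false;
        if (!row.containsColumn(family, nextMail)) {
            put.add(family, nextMail, Bytes.toBytes(0L));     // dummy data
            dirty = true;
        }
        if (!row.containsColumn(family, mailInterval)) {
            put.add(family, mailInterval, Bytes.toBytes(0L)); // dummy data
            dirty = true;
        }
        if (dirty) {
            table.put(put);
        }
    }
} finally {
    scanner.close();
    table.close();
}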

The stacktrace looks like this:

at org.apache.hadoop.hbase.util.Bytes.toLong(Bytes.java:479)
    at org.apache.hadoop.hbase.util.Bytes.toLong(Bytes.java:453)
    at org.datanucleus.store.hbase.fieldmanager.FetchFieldManager.fetchLongField(FetchFieldManager.java:269)
    at org.datanucleus.state.AbstractStateManager.replacingLongField(AbstractStateManager.java:2133)
    at com.kuliza.sitepulse.data.User.jdoReplaceField(User.java)
    at com.kuliza.sitepulse.data.User.jdoReplaceFields(User.java)
    at org.datanucleus.state.JDOStateManagerImpl.replaceFields(JDOStateManagerImpl.java:1989)
    at org.datanucleus.state.JDOStateManagerImpl.replaceFields(JDOStateManagerImpl.java:2009)
    at org.datanucleus.store.hbase.query.HBaseQueryUtils$2.fetchFields(HBaseQueryUtils.java:226)
    at org.datanucleus.state.JDOStateManagerImpl.loadFieldValues(JDOStateManagerImpl.java:803)
    at org.datanucleus.state.JDOStateManagerImpl.initialiseForHollow(JDOStateManagerImpl.java:210)
    at org.datanucleus.state.ObjectProviderFactory.newForHollowPopulated(ObjectProviderFactory.java:88)
    at org.datanucleus.ObjectManagerImpl.findObject(ObjectManagerImpl.java:2794)
    at org.datanucleus.store.hbase.query.HBaseQueryUtils.getObjectUsingApplicationIdForResult(HBaseQueryUtils.java:221)
    at org.datanucleus.store.hbase.query.HBaseQueryUtils.getObjectsOfType(HBaseQueryUtils.java:168)
    at org.datanucleus.store.hbase.query.HBaseQueryUtils.getObjectsOfCandidateType(HBaseQueryUtils.java:80)
    at org.datanucleus.store.hbase.query.JDOQLQuery.performExecute(JDOQLQuery.java:271)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1766)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1655)
    at org.datanucleus.store.query.Query.execute(Query.java:1628)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
    at com.kuliza.sitepulse.service.DataService.getUserWithCredentials(DataService.java:111)
    at com.kuliza.sitepulse.service.AuthenticationService.getUserWithCredentials(AuthenticationService.java:46)
    at com.kuliza.sitepulse.controller.AuthenticationController.signIn(AuthenticationController.java:69)

And my method (at DataService.java:111), which throws the exception, is:

@Override
public User getUserWithCredentials(String userName, String password) {
    PersistenceManager pm = pmf.getPersistenceManager();
    Query q = pm.newQuery("SELECT FROM " + User.class.getName() +
            " WHERE userName == \"" + userName + "\" && password == \"" + password + "\"");
    List<User> c = (List<User>) q.execute();
    pm.close();
    if (c.size() > 0)
        return c.get(0);
    else
        return null;
}
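As a side note (not related to the exception itself), the same lookup can also be expressed with implicit JDOQL parameters instead of building the filter by string concatenation. A rough sketch, assuming the same pmf field and User class as above:

// Sketch only: same lookup with declarative parameters bound at execute time.
public User getUserWithCredentials(String userName, String password) {
    PersistenceManager pm = pmf.getPersistenceManager();
    try {
        Query q = pm.newQuery(User.class, "userName == :name && password == :pass");
        @SuppressWarnings("unchecked")
        List<User> users = (List<User>) q.execute(userName, password);
        return users.isEmpty() ? null : users.get(0);
    } finally {
        pm.close();
    }
}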

I have actually added two new columns (mailIntervalInMilliseconds, nextMailTimestamp), both of type long, and in the stacktrace I can see it is trying to convert the DB column to a long (AFAIK).


Released versions of the DataNucleus HBase plugin don't currently support all modes of schema evolution, in particular the addition of fields/properties of primitive types. However, DataNucleus SVN does have support for this, so you could use that.
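To see why primitives are the problematic case: for an old row there are no stored bytes for the new column, and a primitive long has no way to represent "absent", so the byte conversion fails before any default can be applied. A minimal illustration of that failure mode (not the DataNucleus code itself):

import org.apache.hadoop.hbase.util.Bytes;

public class MissingCellDemo {
    public static void main(String[] args) {
        // An old row has no cell for the new column, so there is nothing to decode.
        byte[] storedBytes = new byte[0];
        // Bytes.toLong needs exactly 8 bytes; with nothing stored it throws,
        // which is the failure the stack trace above shows at fetch time.
        long value = Bytes.toLong(storedBytes);
        System.out.println(value);
    }
}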

