9 messages in org.apache.hadoop.hbase-user: Unique row ID constraint
From                    Sent On                  Attachments
Tatsuya Kawano          Apr 28, 2010 7:40 am
Stack                   Apr 28, 2010 9:42 am
Ryan Rawson             Apr 28, 2010 9:41 pm
Tatsuya Kawano          Apr 29, 2010 1:33 am
Todd Lipcon             Apr 29, 2010 9:36 am
Guilherme Germoglio     Apr 29, 2010 9:58 am
Michael Segel           Apr 29, 2010 1:08 pm
Tatsuya Kawano          Apr 30, 2010 9:31 am
Tatsuya Kawano          May 8, 2010 4:21 pm
Subject:Unique row ID constraint
From:Tatsuya Kawano (tats@snowcocoa.info)
Date:Apr 28, 2010 7:40:03 am
List:org.apache.hadoop.hbase-user

Hi,

I'd like to implement a unique row ID constraint (like the primary key constraint in an RDBMS) in my application framework.

Here is a code fragment from my current implementation (HBase 0.20.4rc) written in Scala. It works as expected, but is there a better (shorter) way to do this, such as checkAndPut()? I'd like to pass a single Put object to my function (method) rather than passing rowId, family, qualifier and value separately. I can't do this now because I have to give the rowLock object when I instantiate the Put.

===============================================
def insert(table: HTable, rowId: Array[Byte], family: Array[Byte],
    qualifier: Array[Byte], value: Array[Byte]): Unit = {

  val get = new Get(rowId)

  val lock = table.lockRow(rowId) // will expire in one minute
  try {
    if (table.exists(get)) {
      throw new DuplicateRowException(
        "Tried to insert a duplicate row: " + Bytes.toString(rowId))
    } else {
      val put = new Put(rowId, lock)
      put.add(family, qualifier, value)
      table.put(put)
    }
  } finally {
    table.unlockRow(lock)
  }
}
===============================================
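For comparison, a checkAndPut()-based version might look like the sketch below. This is only a sketch, assuming the HBase 0.20.x HTable.checkAndPut(row, family, qualifier, value, put) call, where passing null as the expected value makes the Put succeed only if that cell does not yet exist; insertUnique is a hypothetical name, and DuplicateRowException is the application exception from the snippet above.

===============================================
// Sketch only: relies on checkAndPut() with a null expected value,
// which applies the Put atomically only when the cell is absent.
def insertUnique(table: HTable, put: Put, family: Array[Byte],
    qualifier: Array[Byte]): Unit = {
  val rowId = put.getRow
  // No explicit row lock needed; the check and the put are one
  // atomic server-side operation.
  if (!table.checkAndPut(rowId, family, qualifier, null, put)) {
    throw new DuplicateRowException(
      "Tried to insert a duplicate row: " + Bytes.toString(rowId))
  }
}
===============================================

Note that checkAndPut() tests a single cell rather than whole-row existence, so this only matches the exists(get) behavior if that particular family:qualifier cell is always written on insert.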

Thanks,

twitter: http://twitter.com/tatsuya6502