NX (Bodgitt and Scarper, LLC): recNo primitive long on DBAccess Interface methods

 
Greenhorn
Posts: 13
Hey All,
I had hoped to read all of the records into memory and then only hit the disk on create/delete/update calls. However, since recNo is a long, I'm kind of stuck: Collections and arrays are indexed by int, and I could have more than <the max int value> records.
Does anyone have any ideas? Or am I just stuck reading from disk on every "find call"? I'm posting the interface below.
*****************************************
package suncertify.db;

public interface DBAccess {

    // Reads a record from the file. Returns an array where each
    // element is a record value.
    public String[] readRecord(long recNo)
            throws RecordNotFoundException;

    // Modifies the fields of a record. The new value for field n
    // appears in data[n]. Throws SecurityException
    // if the record is locked with a cookie other than lockCookie.
    public void updateRecord(long recNo, String[] data, long lockCookie)
            throws RecordNotFoundException, SecurityException;

    // Deletes a record, making the record number and associated disk
    // storage available for reuse.
    // Throws SecurityException if the record is locked with a cookie
    // other than lockCookie.
    public void deleteRecord(long recNo, long lockCookie)
            throws RecordNotFoundException, SecurityException;

    // Returns an array of record numbers that match the specified
    // criteria. Field n in the database file is described by
    // criteria[n]. A null value in criteria[n] matches any field
    // value. A non-null value in criteria[n] matches any field
    // value that begins with criteria[n]. (For example, "Fred"
    // matches "Fred" or "Freddy".)
    public long[] findByCriteria(String[] criteria);

    // Creates a new record in the database (possibly reusing a
    // deleted entry). Inserts the given data, and returns the record
    // number of the new record.
    public long createRecord(String[] data)
            throws DuplicateKeyException;

    // Locks a record so that it can only be updated or deleted by this
    // client. Returned value is a cookie that must be used when the
    // record is unlocked, updated, or deleted. If the specified record
    // is already locked by a different client, the current thread gives
    // up the CPU and consumes no CPU cycles until the record is unlocked.
    public long lockRecord(long recNo)
            throws RecordNotFoundException;

    // Releases the lock on a record. Cookie must be the cookie
    // returned when the record was locked; otherwise throws SecurityException.
    public void unlock(long recNo, long cookie)
            throws SecurityException;
}
*****************************************
Thanks,
Tim
 
Wanderer
Posts: 18671
Your recNo is a long? Wow. Mine is int; I assumed that's what everyone else had. Hmmm... I suppose you could still put everything in memory using a List of Lists, an array of arrays, a List of arrays, or whatever. But if recNo really needs to be long, if there are really that many records, then maybe you shouldn't be putting them all in memory. Stick to reading from the file, and anyone who ever calls find() will just have to accept that if the file is huge, the performance is going to suck. If the company has that many records, they should really be using a real database. Just do what's simplest, keep nothing in memory, and don't worry about possible crappy performance; it's not your fault.
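If the records really must live in memory, the "array of arrays" idea above could be sketched like this (a hedged illustration; the class name, chunk size, and methods are all invented, not part of the assignment). The long recNo is split into a chunk index (high bits) and an offset (low bits), so no single array ever needs more than Integer.MAX_VALUE elements:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical two-level cache for records keyed by a long recNo.
// Each chunk is a plain array; the outer List grows lazily as higher
// record numbers are stored.
public class ChunkedRecordCache {
    private static final int CHUNK_BITS = 16;              // 65,536 records per chunk
    private static final int CHUNK_SIZE = 1 << CHUNK_BITS;
    private static final int CHUNK_MASK = CHUNK_SIZE - 1;

    private final List<String[][]> chunks = new ArrayList<>();

    public void put(long recNo, String[] record) {
        int chunk = (int) (recNo >>> CHUNK_BITS);
        while (chunks.size() <= chunk) {                   // grow lazily
            chunks.add(new String[CHUNK_SIZE][]);
        }
        chunks.get(chunk)[(int) (recNo & CHUNK_MASK)] = record;
    }

    public String[] get(long recNo) {
        int chunk = (int) (recNo >>> CHUNK_BITS);
        if (chunk >= chunks.size()) {
            return null;                                   // never stored
        }
        return chunks.get(chunk)[(int) (recNo & CHUNK_MASK)];
    }
}
```

The trade-off is that each chunk is allocated whole, so sparse record numbers waste space; for a genuinely huge or sparse file, reading from disk is still simpler.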
 
Ranch Hand
Posts: 435
My record number was a long as well; I wrapped it in a Long and put it in a Map.
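A minimal sketch of that approach (class and method names here are illustrative, not from the assignment): autoboxing turns the primitive long recNo into a Long key, so a standard Collection works after all.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a record cache keyed by the boxed record number.
public class RecordCache {
    private final Map<Long, String[]> records = new HashMap<>();

    // Cache a record under its record number; the long is boxed to Long.
    public void put(long recNo, String[] data) {
        records.put(recNo, data);
    }

    // Returns the cached record, or null if recNo is unknown.
    public String[] get(long recNo) {
        return records.get(recNo);
    }
}
```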
Tony
 
Timothy Johnson
Greenhorn
Posts: 13
Hey Fellas,
Something occurred to me, though: since the #findByCriteria method returns a long[], and a Java array can't hold more than <max value for int> elements, there shouldn't be more than that many records anyway. So that leaves me thinking that 'recNo' is really the byte offset of a record within the file.
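Under that assumption, a disk-backed read might look like the sketch below. This is purely illustrative: the field lengths are invented (the real schema comes from the file header), and the class name is made up; it just shows seeking to recNo as a byte offset and reading one fixed-length record.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

// Sketch: treat recNo as the byte offset of a fixed-length record.
public class DiskReader {
    // Invented field widths (name, location, specialties, size, rate, owner).
    private static final int[] FIELD_LENGTHS = {32, 64, 64, 8, 4, 8};

    private final RandomAccessFile file;

    public DiskReader(RandomAccessFile file) {
        this.file = file;
    }

    public String[] readRecord(long recNo) throws IOException {
        file.seek(recNo);                                  // recNo as byte offset
        String[] fields = new String[FIELD_LENGTHS.length];
        for (int i = 0; i < FIELD_LENGTHS.length; i++) {
            byte[] buf = new byte[FIELD_LENGTHS[i]];
            file.readFully(buf);                           // exactly one field
            fields[i] = new String(buf, StandardCharsets.US_ASCII).trim();
        }
        return fields;
    }
}
```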
Even with this assumption, I'm still thinking that it might be better to just read from disk on searches (as opposed to pulling everything into memory) because of possible memory utilization issues.
Thanks for your feedback, any additional thoughts?
-Tim
 
Bartender
Posts: 1872
Hi Timothy,
You evoke two possible solutions for reading records from the file: the "cache-all" one and the "no-cache-at-all" one.
What about a third one, "partial-caching"? That's the one I implemented, and it's the most typical cache design IMO. It's a little bit harder to design and code, because you need some cleanup process (mine takes individual record hits and last access times into account to choose the right "candidates" to be cleared).
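A simpler stand-in for that cleanup process, assuming a plain least-recently-used eviction policy rather than the hits-plus-last-access scheme described above, is LinkedHashMap in access order (the class name and capacity are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Partial cache sketch: LinkedHashMap in access order evicts the
// least-recently-used record automatically once maxEntries is exceeded.
public class PartialCache extends LinkedHashMap<Long, String[]> {
    private final int maxEntries;

    public PartialCache(int maxEntries) {
        super(16, 0.75f, true);   // true = order entries by access, not insertion
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Long, String[]> eldest) {
        return size() > maxEntries;   // drop the stalest entry on overflow
    }
}
```

This caps memory use while keeping hot records in RAM; a miss falls back to a disk read, which is the whole point of partial caching.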
Best,
Phil.
 
Timothy Johnson
Greenhorn
Posts: 13
Hey Fellas,
I've gone through and implemented all of the methods except #createRecord, and the reason I stopped is that the DuplicateKeyException this method can throw makes no sense to me. As far as I'm concerned, the client is going to pass in an array with 6 elements (name, location, specialties, rate, size, and owner), i.e. the fields within the db file, and that's it. Isn't key generation a function of the db system? I'm not sure what type of scenario would cause me to throw this exception, and I wanted to see if you guys have any thoughts.
Thanks again,
Tim
 
author and jackaroo
Posts: 12200
Hi Tim,
At the top of this page, under the "Post new topic" and "Post reply" buttons is a link to our search utility.
If you click on it, and enter "DuplicateKeyException" you will find plenty of posts where this has been discussed before.
Regards, Andrew
 
Timothy Johnson
Greenhorn
Posts: 13
Thanks Andrew, sorry about that.
-Tim
 