RandomAccessFile problem during modification

Santana Iyer
Ranch Hand

Joined: Jun 13, 2005
Posts: 335
I have created a file:

RandomAccessFile f = new RandomAccessFile("hello.maptxt", "rw");

f.writeBytes("A1?;D123\n");
f.writeBytes("DEF;ABC\n");
f.writeBytes("B*;EST\n");
f.writeBytes("C?;D13512332\n");
f.close();

Now I want to replace the record DEF;ABC with ABC;DEF.

RandomAccessFile f = new RandomAccessFile("hello.maptxt", "rw");

while (true) {
    long begin = f.getFilePointer();
    String str = f.readLine();

    if (str == null) {              // end of file, record not found
        break;
    }
    if (str.equals("DEF;ABC")) {
        f.seek(begin);              // jump back to the start of the matching record
        f.writeBytes("ABC;DEF\n");
        break;
    }
    System.out.println(str);
}
f.close();
This works fine, but whenever the number of characters differs, the next record gets affected.

For example, if I want to replace ABC EF with ABC EFG, the next record gets affected.

What should I do?
Peter Chase
Ranch Hand

Joined: Oct 30, 2001
Posts: 1970
There's no operation for replacing different-sized data in a RandomAccessFile, therefore you have to do it yourself. If the data you are replacing is a different length to the new data, you must move the rest of the file contents appropriately. If the file is known to be small, this is pretty easy. If it could be big, you need to take plenty of care.

In general, it is bad to have a data file structure that requires you to do this type of thing. If you have the opportunity to change the file format, you probably should do so. You could consider allowing enough space for any likely value, and filling unused spaces with some special byte. Or you could go for a more advanced format, using indirection within the file.

On a minor point, you have a bug in your code regarding line endings. You are writing the line ending as '\n', and reading using readLine(). These two are not guaranteed to use the same line endings (I guess they do on the platform you currently use, if your program "works"). In addition, you should be aware that you are implicitly doing lots of String/byte and byte/String conversions using the JVM's default character encoding.
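
To make the first point concrete, here is a minimal sketch of the "move the rest of the file contents" approach for a file small enough to buffer the tail in memory. The method name replaceLine is made up for illustration, and it keeps the '\n' / readLine() convention from the question, so the line-ending caveat above still applies.

import java.io.IOException;
import java.io.RandomAccessFile;

public class InPlaceReplace {

    // Replace the first line equal to oldLine with newLine, shifting the rest
    // of the file when the lengths differ. Assumes the tail of the file after
    // the old record fits comfortably in memory.
    static void replaceLine(String fileName, String oldLine, String newLine)
            throws IOException {
        RandomAccessFile f = new RandomAccessFile(fileName, "rw");
        try {
            long begin;
            String line;
            while (true) {
                begin = f.getFilePointer();
                line = f.readLine();
                if (line == null) {
                    return;                          // record not found
                }
                if (line.equals(oldLine)) {
                    break;
                }
            }
            long afterOld = f.getFilePointer();      // start of the next record
            byte[] tail = new byte[(int) (f.length() - afterOld)];
            f.readFully(tail);                       // buffer everything after the old record

            f.seek(begin);
            f.writeBytes(newLine + "\n");            // write the replacement
            f.write(tail);                           // write back the shifted tail
            f.setLength(f.getFilePointer());         // truncate if the file got shorter
        } finally {
            f.close();
        }
    }
}

With the example file from the question, replaceLine("hello.maptxt", "DEF;ABC", "ABC;DEFGH") rewrites the record even though the replacement is longer, because the following records are shifted rather than overwritten.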

Joe Ess
Bartender

Joined: Oct 29, 2001
Posts: 8876

JavaRanch IO FAQ: Edit An Existing File

Santana Iyer
Ranch Hand

Joined: Jun 13, 2005
Posts: 335
Thanks Peter and Joe.

My requirement is that I should be able to write, read and modify file contents.
The file will contain around 100,000 records.

Creating a temp file does not seem like a good solution to me.

What do you suggest?
William Brogden
Author and all-around good cowpoke
Rancher

Joined: Mar 22, 2000
Posts: 12769
Creating temp file does not seem to be good solution to me.

I guarantee it will be easier than coming up with an "in-place" editing scheme that can handle un-equal record lengths. Furthermore, note that if in-place editing fails for any reason, the original data file will be corrupted.

If you absolutely have to pursue in-place editing of files too big to hold in memory, look at the way word processors handle this sort of thing.
Bill
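
To make the temp-file suggestion concrete, a minimal sketch (the replaceRecord method name and the .tmp suffix are illustrative assumptions, and error handling is kept to a bare minimum):

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class RewriteViaTempFile {

    // Copy every record to a temp file, substituting the changed record on
    // the way, then swap the temp file into place.
    static void replaceRecord(String fileName, String oldRecord, String newRecord)
            throws IOException {
        File original = new File(fileName);
        File temp = new File(fileName + ".tmp");

        BufferedReader in = new BufferedReader(new FileReader(original));
        PrintWriter out = new PrintWriter(new FileWriter(temp));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                out.println(line.equals(oldRecord) ? newRecord : line);
            }
        } finally {
            in.close();
            out.close();
        }

        // Replace the original only after the temp file was written completely.
        if (!original.delete() || !temp.renameTo(original)) {
            throw new IOException("could not swap " + temp + " into place");
        }
    }
}

Because the original file is only deleted after the temp file has been written completely, a failure part-way through leaves the original data intact.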
Santana Iyer
Ranch Hand

Joined: Jun 13, 2005
Posts: 335
Thank you all for the suggestions.
Santana Iyer
Ranch Hand

Joined: Jun 13, 2005
Posts: 335
If I fix the length of every record at, say, 30 bytes (padding with whitespace so that every record is exactly 30 bytes), can I then modify the same file in place?

The file can have 2 million records.

What is your suggestion?
Joe Ess
Bartender

Joined: Oct 29, 2001
Posts: 8876

If you can guarantee that records will not exceed a fixed size, RandomAccessFile will work fine. Of course, you have to consider what to do about inserting, deleting and even finding records in a file that big. At some point you are doing a lot of work creating a low-level database and less work on your problem. At that point it's easier to use an in-process database like Berkeley DB or, for larger applications, a full-blown RDBMS.
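
For illustration, a minimal sketch of fixed-size records with RandomAccessFile, assuming the 30-byte record length from the question; the class and method names are made up:

import java.io.IOException;
import java.io.RandomAccessFile;

public class FixedWidthRecords {

    static final int RECORD_LENGTH = 30;   // assumed fixed record size, including the newline

    // Overwrite record number 'index' (0-based) in place. Works because every
    // record occupies exactly RECORD_LENGTH bytes, so the byte offset of any
    // record is just index * RECORD_LENGTH.
    static void writeRecord(RandomAccessFile f, int index, String data)
            throws IOException {
        if (data.length() >= RECORD_LENGTH) {
            throw new IllegalArgumentException("record too long: " + data);
        }
        StringBuffer padded = new StringBuffer(data);
        while (padded.length() < RECORD_LENGTH - 1) {
            padded.append(' ');                // pad with spaces to the fixed width
        }
        padded.append('\n');
        f.seek((long) index * RECORD_LENGTH);
        f.writeBytes(padded.toString());
    }

    static String readRecord(RandomAccessFile f, int index) throws IOException {
        f.seek((long) index * RECORD_LENGTH);
        return f.readLine().trim();            // strip the padding again
    }
}

With fixed-width records, modifying record n is a seek(n * RECORD_LENGTH) followed by a write of exactly RECORD_LENGTH bytes, so neighbouring records are never disturbed.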
Santana Iyer
Ranch Hand

Joined: Jun 13, 2005
Posts: 335
Thanks, yes, you are right regarding the use of a DB. We suggested that too, but the requirement at the other end is to use a file and no DB.

Regarding deletion, I am thinking of creating a temp file and renaming it.
Steve Fahlbusch
Bartender

Joined: Sep 18, 2000
Posts: 562

I guess the question to ask is why the requirement is for a file and not a DB - Berkeley DB is quite good, and it supports transactions, rollbacks and multiple types of indexing.

Well, what about packages that handle indexed files (such as B-trees, or ISAM/VSAM - I guess this could be called JSAM)?

And note: usually for such large files, one uses a delete flag; then you can compact the file after it is backed up.
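
A rough sketch of the delete-flag idea, building on the fixed-width layout sketched above; the '*' flag character and the method names are assumptions:

import java.io.IOException;
import java.io.RandomAccessFile;

public class DeleteFlag {

    static final int RECORD_LENGTH = 30;   // same assumed fixed width as above
    static final char DELETED = '*';       // assumed flag character

    // Logical delete: overwrite the first byte of the record with the flag.
    static void markDeleted(RandomAccessFile f, int index) throws IOException {
        f.seek((long) index * RECORD_LENGTH);
        f.writeByte(DELETED);
    }

    // Compaction: copy only the live records to a fresh file (run after backup).
    static void compact(RandomAccessFile in, RandomAccessFile out) throws IOException {
        in.seek(0);
        String line;
        while ((line = in.readLine()) != null) {
            if (line.length() == 0 || line.charAt(0) != DELETED) {
                out.writeBytes(line + "\n");
            }
        }
    }
}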