tokenizer help

Ellen fish
Greenhorn

Joined: Oct 28, 2008
Posts: 27
Hello,

I'm reading a file that looks like this:

drwxr-xr-x 8 user staff 512 Nov 28 16:17 build
drwxr-xr-x 2 user staff 512 Nov 28 16:17 build.xml

and I want to use the permissions, the file size, and the file name, e.g. (drwxr-xr-x, 512, build).

Right now I'm using nextToken() to get the tokens I want.
Is there a simpler way to replace this code, such as turning the line into an array of tokens?

Thank you so much,

// line holds one row of the listing, e.g.
// "drwxr-xr-x 8 user staff 512 Nov 28 16:17 build"
StringTokenizer st1 = new StringTokenizer(line);
StringBuffer str = new StringBuffer(); // not used below

System.out.println(st1.nextToken()); // 1st token: permissions
st1.nextToken();
st1.nextToken();
st1.nextToken();
System.out.println(st1.nextToken()); // 5th token: file size
st1.nextToken();
st1.nextToken();
st1.nextToken();
System.out.println(st1.nextToken()); // 9th token: file name
Piet Verdriet
Ranch Hand

Joined: Feb 25, 2006
Posts: 266
Why not simply split on whitespace? The information you're interested in is at indexes 0, 4, and 8.
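For instance, a minimal sketch of that approach (the line variable is just a placeholder for one row read from the file, and it assumes single spaces between the columns, as in the sample above):

String line = "drwxr-xr-x 8 user staff 512 Nov 28 16:17 build";
String[] tokens = line.split("\\s");  // split on whitespace
System.out.println(tokens[0]);        // permissions: drwxr-xr-x
System.out.println(tokens[4]);        // size:        512
System.out.println(tokens[8]);        // name:        build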

Ellen fish
Greenhorn

Joined: Oct 28, 2008
Posts: 27
Thank you, this is exactly what I'm looking for.

However, using String[] tokens = "drwxr-xr-x 8 user staff 512 Dec 1 13:19 .".split("\\s"); I get tokens.length == 23.
I have also tried other split patterns, but they don't work.

any ideas?

drwxr-xr-x 8 user staff 512 Dec 1 13:19 .
Piet Verdriet
Ranch Hand

Joined: Feb 25, 2006
Posts: 266
You're welcome, Ellen.
Ellen fish
Greenhorn

Joined: Oct 28, 2008
Posts: 27
use .split("\\s+")
I got it~
hehe thank you
Piet Verdriet
Ranch Hand

Joined: Feb 25, 2006
Posts: 266
Originally posted by Ellen fish:
use .split("\\s+")
I got it~
hehe thank you


Ah yes, the forum probably merged the runs of multiple spaces into single ones.
I see you have found the right solution! Well done.
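To make that concrete, here is a small sketch (the double spaces in the sample line are only an assumption about what the real file contains): split("\\s") treats every single whitespace character as a separator, so consecutive spaces leave empty strings in the array, while split("\\s+") treats each run of whitespace as one separator:

String line = "drwxr-xr-x  8 user staff   512 Dec  1 13:19 .";
String[] perChar = line.split("\\s");   // empty strings appear wherever spaces are doubled
String[] perRun  = line.split("\\s+");  // each run of whitespace counts as one separator
System.out.println(perChar.length);     // more than 9, because of the empty strings
System.out.println(perRun.length);      // 9
System.out.println(perRun[0] + ", " + perRun[4] + ", " + perRun[8]); // drwxr-xr-x, 512, .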