I would like to run a kind of search function on a text file using StringTokenizers. If the user-entered value is equal to a specific token (say, always the first token of a line), then the third token of that same line should be added to an array.
The problem I am encountering is that I don't know if it's possible to add data to an array before its length has been declared (and thus before the array is even instantiated). I can't have extra elements at the end, nor can I be short on elements: the array's length must exactly equal the number of matches in the text file, and it has to handle any number of them. I would also like to avoid just incrementing a counter and then going back through the file a second time to grab the third tokens (since a StringTokenizer is only good for one pass), as that's really inconvenient and inefficient.
Is there any way that you guys know of that I can somehow efficiently do this? Any sort of guidance at all would be very appreciated.
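For what it's worth, a single StringTokenizer can hand back every token on a line through repeated nextToken() calls, so the third token is reachable in the same pass that checks the first one. Here's a minimal sketch of that per-line check (the method name and the sample lines are made up for illustration):

```java
import java.util.StringTokenizer;

public class TokenSearch {
    // Returns the third token of the line if its first token equals key,
    // or null if the line doesn't match or has fewer than three tokens.
    static String thirdTokenIfFirstMatches(String line, String key) {
        StringTokenizer st = new StringTokenizer(line);
        if (!st.hasMoreTokens() || !st.nextToken().equals(key)) {
            return null;               // empty line or first token doesn't match
        }
        if (!st.hasMoreTokens()) {
            return null;               // no second token
        }
        st.nextToken();                // skip the second token
        return st.hasMoreTokens() ? st.nextToken() : null;
    }

    public static void main(String[] args) {
        System.out.println(thirdTokenIfFirstMatches("apple red 42", "apple"));  // 42
        System.out.println(thirdTokenIfFirstMatches("pear green 7", "apple"));  // null
    }
}
```

That takes care of the "one pass per line" part; the array-sizing part is the remaining question.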
Never mind, I actually figured out how to solve my own problem efficiently - at least as efficiently as I can think of right now. I know how many entries (lines of text) there are because that's the first line in the text file, so I can make one array of that length. I can then store all the matches in that array while keeping a running count of them. After I go through the entire file, I just instantiate the final array with a length equal to the number of matches I counted. Then it's simply a matter of iterating through the original array and copying values until the index I'm on equals the match count. This fills the array completely with what I need - no extra fluff and nothing missing!
If anybody can see any problems with my logic here, please let me know!
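The logic above looks sound to me. Here's one way it might be sketched in code - the file format (entry count on the first line, then whitespace-separated records) follows your description, but the class and method names are made up:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.StringTokenizer;

public class MatchCollector {
    // Reads the entry count from the first line, collects the third token of
    // every line whose first token equals key, then trims to an exact-size array.
    static String[] collectMatches(BufferedReader in, String key) throws IOException {
        int entries = Integer.parseInt(in.readLine().trim());
        String[] temp = new String[entries];   // oversized scratch array
        int matchCount = 0;                    // running count of matches
        for (int i = 0; i < entries; i++) {
            StringTokenizer st = new StringTokenizer(in.readLine());
            if (st.countTokens() >= 3 && st.nextToken().equals(key)) {
                st.nextToken();                // skip the second token
                temp[matchCount++] = st.nextToken();
            }
        }
        String[] result = new String[matchCount];  // exact length, no fluff
        for (int i = 0; i < matchCount; i++) {
            result[i] = temp[i];
        }
        return result;
    }

    public static void main(String[] args) throws IOException {
        String data = "3\napple red 42\npear green 7\napple blue 9\n";
        BufferedReader in = new BufferedReader(new StringReader(data));
        for (String s : collectMatches(in, "apple")) {
            System.out.println(s);             // prints 42 then 9
        }
    }
}
```

One side note: the copy loop at the end could be replaced with System.arraycopy, and a java.util.ArrayList (which grows on demand, with toArray giving exactly the trimmed result) would avoid the scratch array entirely - but if the assignment requires a plain array, your two-array approach works fine.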