
Problem writing extended ASCII characters into a file on Japanese machines

 
Suren BabuM
Greenhorn
Posts: 1
Take, for example, ®, the registered trademark sign (entry 174 in the extended ASCII / ISO-8859-1 table; plain ASCII stops at 127).
Requirement: read this character from the database and write it into a file.
The character is returned from the database properly. But once I write it to a file using the following code and open the file in Notepad, it shows as ?. I opened the file in binary mode to check the byte value, and it is 63 (0x3F, which is '?') instead of 174 (0xAE, which is ®).
result holds the information fetched from the database.
BufferedWriter writer1 = new BufferedWriter(new FileWriter("FromDatabase_native_CharacterIO.txt"));
writer1.write(result);
writer1.flush();
writer1.close();
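If I understand it correctly, FileWriter never takes a charset: it always encodes with the JVM's platform default (typically windows-31j on a Japanese Windows box), and any character that charset cannot encode is silently replaced with '?' (byte 63), which matches what I see. A minimal check of which default the JVM is actually using (class name is just for illustration):

```java
import java.nio.charset.Charset;

public class DefaultCharsetCheck {
    public static void main(String[] args) {
        // FileWriter behaves like OutputStreamWriter(fos, Charset.defaultCharset()):
        // whatever this prints is the encoding the failing code used.
        System.out.println("Default charset: " + Charset.defaultCharset());
    }
}
```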
The following code works fine...
FileOutputStream fos = new FileOutputStream("FromDatabase_unicode.txt");
OutputStreamWriter osw = new OutputStreamWriter(fos, "Unicode");
System.out.println("Encoding: " + osw.getEncoding());
BufferedWriter bw = new BufferedWriter(osw);
bw.write(result);
bw.flush();
bw.close();
I don't want to mandate the 'Unicode' encoding, as I have other problems with that.
Any help is highly appreciated; this has been sitting in my head for quite some time, troubling me a lot.
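One possible way out, assuming the downstream consumer expects single-byte Latin-1 rather than UTF-16: pass an explicit charset to OutputStreamWriter, exactly as the working snippet does, but name ISO-8859-1 (where the registered sign maps to the single byte 174) instead of "Unicode". A sketch under that assumption; the file name and the hard-coded result string are illustrative stand-ins for the database fetch:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;

public class Latin1WriteDemo {
    public static void main(String[] args) throws IOException {
        String result = "\u00AE"; // the registered sign, standing in for the database value
        // Naming the charset explicitly bypasses the platform default entirely;
        // in ISO-8859-1, U+00AE encodes as the single byte 0xAE (decimal 174).
        try (Writer w = new OutputStreamWriter(
                new FileOutputStream("FromDatabase_latin1.txt"),
                StandardCharsets.ISO_8859_1)) {
            w.write(result);
        }
    }
}
```

Dumping the file afterwards should show the single byte 0xAE rather than 0x3F; Notepad should then display ® as long as it is told (or guesses) the right encoding.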
 