Finding encryption time

 
bashir adil
Greenhorn
Posts: 9
Hello everybody, I have implemented the RSA encryption algorithm in Java and now I want to find the encryption time for an input message so that I can compare it across different message lengths. Kindly help.
 
Richard Tookey
Bartender
Posts: 1166
Are you wanting a theoretical analysis of the time taken, or are you wanting to measure the time? If the former, you will have to do the analysis yourself, since the encryption time will depend significantly on your implementation details and on how you break the data down into chunks smaller than the modulus. If the latter, you can use System.currentTimeMillis() to measure the elapsed time.
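
For example, a minimal sketch of measuring the elapsed time (here the JDK's own RSA implementation via javax.crypto stands in for your implementation; swap in your own encryption call):

import java.security.KeyPair;
import java.security.KeyPairGenerator;
import javax.crypto.Cipher;

public class EncryptionTimer {
    public static void main(String[] args) throws Exception {
        // Generate a 2048-bit RSA key pair
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        Cipher cipher = Cipher.getInstance("RSA");
        cipher.init(Cipher.ENCRYPT_MODE, kp.getPublic());

        byte[] message = new byte[100]; // vary this length for your comparison

        long start = System.currentTimeMillis();
        byte[] ciphertext = cipher.doFinal(message);
        long elapsed = System.currentTimeMillis() - start;

        System.out.println("Encrypted " + message.length + " bytes in " + elapsed + " ms");
    }
}

Note that a single call will often report 0 ms, since currentTimeMillis() can be as coarse as 10-15 ms on some platforms.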
 
bashir adil
Greenhorn
Posts: 9
Thanks, System.currentTimeMillis() is known to me and I am using System.nanoTime(). Is that the correct method, or is there a more appropriate way of doing this?
 
Paul Clapham
Sheriff
Posts: 20998
Yes, those two methods are how you measure how long Java code takes to run. There aren't specific methods for measuring the time that specific things (like RSA encryption) take; that would be no way to design a language.
 
Richard Tookey
Bartender
Posts: 1166
When one is trying to measure the run time of an algorithm, it is normally better to get the total run time for a fairly large number of runs and then work out the average. Also, since the JIT compiler executes at run time, it is normal to disregard the first few runs so that the time taken by the JIT does not distort the averages.
 
Henry Wong
author
Marshal
Posts: 21024
Richard Tookey wrote:When one is trying to measure the run time of an algorithm, it is normally better to get the total run time for a fairly large number of runs and then work out the average. Also, since the JIT compiler executes at run time, it is normal to disregard the first few runs so that the time taken by the JIT does not distort the averages.

I would go a few steps further... I would:

1. Do hundreds of iterations with one time stamp before and one after, and take the overall average as one data point. The reason: the time for a single iteration may be smaller than the resolution of the clock.

2. Do hundreds of thousands of iterations of the calculation in step one (something like 100,000 data points) and throw them away. The reason is the JIT, as already mentioned.

3. Then do millions of iterations of the calculation in step one (something like 1,000,000 data points). Why? You need to compute the min, max, median, mean, standard deviation, etc. If you are getting a large standard deviation, or a really large max, then it is possible that you are running into GC issues (i.e. you are measuring the garbage collector).

Henry
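
Putting these steps together, a minimal sketch of such a harness (here the JDK's javax.crypto RSA again stands in for your own implementation, and the iteration counts are scaled down from the numbers above for illustration):

import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Arrays;
import javax.crypto.Cipher;

public class RsaBenchmark {
    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();
        Cipher cipher = Cipher.getInstance("RSA");
        cipher.init(Cipher.ENCRYPT_MODE, kp.getPublic());
        byte[] message = new byte[100];

        int batch = 100;    // step 1: iterations per data point
        int warmup = 100;   // step 2: data points discarded for the JIT
        int points = 1000;  // step 3: data points actually kept

        double[] samples = new double[points];
        for (int p = -warmup; p < points; p++) {
            long start = System.nanoTime();
            for (int i = 0; i < batch; i++) {
                cipher.doFinal(message);
            }
            double avgNanos = (System.nanoTime() - start) / (double) batch;
            if (p >= 0) {   // negative indices are warm-up points; discard them
                samples[p] = avgNanos;
            }
        }

        // Summary statistics: a large max or spread hints at GC interference
        Arrays.sort(samples);
        double mean = Arrays.stream(samples).average().getAsDouble();
        System.out.printf("min %.0f ns, median %.0f ns, mean %.0f ns, max %.0f ns%n",
                samples[0], samples[points / 2], mean, samples[points - 1]);
    }
}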
 