Hello everybody, I have implemented the RSA encryption algorithm in Java, and now I want to measure the encryption time for an input message so that I can compare it for different message lengths. Kindly help.
Are you wanting a theoretical analysis of the time taken, or do you want to measure the time? If the former, you will have to do the analysis yourself, since the encryption time will depend significantly on your implementation details and on how you break the data down into chunks smaller than the modulus. If the latter, you can use System.currentTimeMillis() to measure the elapsed time.
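As a minimal sketch of the second approach, here is how you might wrap a single encryption call with timestamps. This assumes you are using the standard JCE Cipher API with a generated key pair; if you hand-rolled your own RSA, substitute your own encrypt call at the marked line.

```java
import javax.crypto.Cipher;
import java.security.KeyPair;
import java.security.KeyPairGenerator;

public class RsaTiming {
    public static void main(String[] args) throws Exception {
        // Generate a 2048-bit RSA key pair (assumed setup; use your own keys if you have them)
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        Cipher cipher = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        cipher.init(Cipher.ENCRYPT_MODE, kp.getPublic());

        // Sample plaintext; with PKCS#1 padding it must be smaller than the modulus
        byte[] message = new byte[100];

        long start = System.nanoTime();
        byte[] ciphertext = cipher.doFinal(message); // the operation being timed
        long elapsed = System.nanoTime() - start;

        System.out.println("Encrypted " + message.length + " bytes into "
                + ciphertext.length + " bytes in " + elapsed + " ns");
    }
}
```

Note that a single timed call like this is only a starting point; as discussed below, one measurement is easily distorted by JIT compilation and clock resolution.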
Thanks, System.currentTimeMillis() is known to me, and I am using System.nanoTime(). Is that the correct method, or is there a more appropriate way of doing this?
Yes, those two methods are how you measure how long Java code takes to run. There are no dedicated methods for measuring the time that specific operations (like RSA encryption) take; that would be no way to design a language.
When one is trying to measure the run time for an algorithm it is normally better to get the total run time for a fairly large number of runs and then work out the average. Also, since the JIT compiler executes at run time it is normal to disregard the first few runs so that the time taken by the JIT does not distort the averages.
Richard Tookey wrote:When one is trying to measure the run time for an algorithm it is normally better to get the total run time for a fairly large number of runs and then work out the average. Also, since the JIT compiler executes at run time it is normal to disregard the first few runs so that the time taken by the JIT does not distort the averages.
I would go a few steps further... I would...
1. Do hundreds of "iterations" with one time stamp before and one after -- and get an overall average as one data point. Reason: the time for one iteration may be smaller than the resolution of the clock.
2. Do hundreds of thousands of iterations of the calculation in step one (get something like 100,000 data points) -- and throw them away. The reason is the JIT, as already mentioned.
3. Then do millions of iterations of the calculation in step one (get something like 1,000,000 data points). Why? You need to compute a min, max, median, mean, standard deviation, etc. If you are getting a large standard deviation, or a really large max, then it is possible that you are running into GC issues (measuring the garbage collector).