My aim is to generate a 3D-looking grid: the horizontal rows are equally spaced, while the vertical lines sit at an angle to the horizontal so that the result looks like a 3D floor. (I have pasted my code below.)
I have a finite (pre-defined) number of rows and columns.
The points (x1, y1) and (x2, y2) are the endpoints of the bottom-most horizontal line.
The angle the left-most vertical line makes with the x-axis is theta, assumed to be 60 degrees.
The length of the top-most horizontal line is 1/4 the length of the bottom-most horizontal line.
Using the above data/assumptions, the basic proportionality theorem, and the straight-line equation, I have generated this pattern.
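To make the setup concrete, here is a minimal sketch of how such a trapezoidal "floor" outline can be generated from those assumptions. This is my own reconstruction, not the poster's code; the class and method names (`FloorGrid`, `rowLines`) and the choice to anchor the left edge at the given slant are all hypothetical.

```java
// Hypothetical sketch (not the original code): generate the horizontal
// row lines of a trapezoidal "3D floor" grid.
import java.awt.geom.Line2D;

public class FloorGrid {
    // Bottom row runs from (x1, y1) to (x2, y1); the left edge leans at
    // 'theta' radians to the x-axis; the top row is 1/4 of the bottom's
    // length; rows are spaced 'd' apart in y (screen y grows downward).
    // Assumes rows >= 2.
    static Line2D.Double[] rowLines(double x1, double x2, double y1,
                                    double theta, double d, int rows) {
        Line2D.Double[] lines = new Line2D.Double[rows];
        double bottomLen = x2 - x1;
        for (int k = 0; k < rows; k++) {
            double y = y1 - k * d;                      // k-th row height
            double t = (double) k / (rows - 1);         // 0 bottom, 1 top
            double len = bottomLen * (1 - 0.75 * t);    // shrink to 1/4
            double left = x1 + k * d / Math.tan(theta); // left-edge slant
            lines[k] = new Line2D.Double(left, y, left + len, y);
        }
        return lines;
    }
}
```

Each row's length is interpolated linearly between the bottom length and a quarter of it, which is one way to read the proportionality assumption; the left endpoint follows the 60-degree left edge.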
My next step is to determine the intersection points for each row and column. For this I have done the following:
- I determined the perpendicular distance between consecutive rows; call it 'd'.
- Let the first point in the first row be some (x, y).
- Now the actual (non-perpendicular) distance between two rows along a column can be deduced from that column's angle (say theta).
- So the actual distance 'radius' will be: radius = d / sin(theta).
- Then I used the polar form of the line equation to get the actual x, y co-ordinates as (x + radius*cos(theta), y - radius*sin(theta)).
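The steps above can be sketched as follows. This is my own illustration under the stated assumptions, not the poster's `findPoints()` method; the names (`GridPoints`, `findColumnPoints`) are hypothetical.

```java
// Hypothetical sketch of the intersection-point computation described
// above (not the original findPoints() method).
import java.awt.geom.Point2D;

public class GridPoints {
    // Given the first point (x, y) of a column, the perpendicular row
    // spacing d, and the column's angle theta (radians from the x-axis),
    // return the column's intersection points with 'rows' rows.
    static Point2D.Double[] findColumnPoints(double x, double y,
                                             double d, double theta,
                                             int rows) {
        Point2D.Double[] pts = new Point2D.Double[rows];
        // Distance along the slanted column between consecutive rows.
        double radius = d / Math.sin(theta);
        for (int i = 0; i < rows; i++) {
            // Step i*radius along the column; screen y grows downward,
            // so moving up the grid subtracts from y.
            pts[i] = new Point2D.Double(
                x + i * radius * Math.cos(theta),
                y - i * radius * Math.sin(theta));
        }
        return pts;
    }
}
```

One quick sanity check on the formula: since the rows are horizontal, the perpendicular spacing d is purely vertical, and indeed (d / sin(theta)) * sin(theta) = d, so every point should drop by exactly d in y regardless of the column angle. If the plotted points do not show that, the error is likely in how d or theta is computed per column rather than in the polar step itself.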
I believe the method of determination is mathematically correct. But when I verified it by connecting the intersection points to the origin, I found them to be inaccurate. There seems to be some error. Can somebody please help out with this?
P.S.: Please try running the code pasted below to see what happens (check out the method findPoints()).
subject: Intersection points not being determined accurately - Java AWT