JavaRanch » Java Forums » Frameworks » Struts
Author

JSP with large data loads slow in Solaris 10 OS; we are using JRun 4 and nested logic:iterate

manjit sharma
Greenhorn

Joined: Nov 02, 2010
Posts: 4
Hi,

I am having an issue with a JSP page loading slowly in a Solaris 10 environment. When the app server runs on Windows, the page loads quite fast.
The JSP page has nested logic:iterate tags with a lot of if/else conditions, which are checked using logic:equal. We are loading around 4000 rows of data; the page renders in about 3 seconds when the app server runs on a Windows box, but takes around 12 seconds on the Solaris box.
We have already checked database performance and network latency; neither is the issue.
JRun 4 running on Windows and Solaris has the same configuration and the same jar files on the classpath.

The app server is JRun 4 and we are using Struts 1.2.

The code snippet from JSP is as follows:
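(The snippet itself did not survive posting.) For illustration only, here is a hypothetical sketch of the nested logic:iterate / logic:equal pattern described above; the bean and property names are invented:

```jsp
<%-- Hypothetical sketch: names are invented, not the poster's actual code --%>
<logic:iterate id="row" name="resultForm" property="rows">
  <tr>
    <logic:iterate id="cell" name="row" property="cells">
      <logic:equal name="cell" property="status" value="OK">
        <td class="ok"><bean:write name="cell" property="value"/></td>
      </logic:equal>
      <logic:notEqual name="cell" property="status" value="OK">
        <td class="error"><bean:write name="cell" property="value"/></td>
      </logic:notEqual>
    </logic:iterate>
  </tr>
</logic:iterate>
```

With 4000 rows, every tag in the inner body is evaluated thousands of times, so per-tag overhead in the JSP runtime is multiplied accordingly.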





I will be very thankful if someone can help me with this.

Thanks,
Manjit
Joe Ess
Bartender

Joined: Oct 29, 2001
Posts: 8997
    

manjit sharma wrote:We are trying to load around 4000 rows of data


The first question you should ask is: is this much data really necessary? What user is going to page through 4k rows?

manjit sharma wrote:
it loads in about 3 seconds when app server is running on windows box but it takes around 12 seconds on Solaris box.


That statistic is worthless without knowing the relative performance of the two machines. A benchmark like DaCapo can give you some good statistics for a valid comparison. It is not uncommon for an organization to upgrade their PCs every few years, but assume that the server is "fast enough". Next thing you know, your cell phone has more computing power than the server and people are wondering why their apps are slow.
[Enterprise Performance]


[How To Ask Questions On JavaRanch]
manjit sharma
Greenhorn

Joined: Nov 02, 2010
Posts: 4
Hi Joe, thanks for your reply.
But the Solaris box has even more memory than the Windows box; it is a very fast machine.
There is no issue on the database side.
I am really not able to figure out why this code renders the JSP page fast when the JRun server runs on Windows but renders the same page slowly on Solaris.
Yes, I agree that displaying 4k rows at once is not a good idea, but the client wants to see all 4k rows at once.
There is no option of pagination or a scroll event.
Please help me if possible.
Thanks,
Manjit
Joe Ess
Bartender

Joined: Oct 29, 2001
Posts: 8997
    

manjit sharma wrote:
But the Solaris box has even more memory than the Windows box; it is a very fast machine.


Memory does not equal speed. My employer has a Sunfire V440 running Solaris 9 with 4GB of memory and dual CPUs, and my laptop with 2GB of memory runs rings around it.


[Thumbnail for daCapoGraph.jpg]

manjit sharma
Greenhorn

Joined: Nov 02, 2010
Posts: 4
Hi Joe, thanks again for your reply.
Our system engineers tell us that the Solaris box is a high-end box just like production, so they have completely ruled out any hardware or network issues with it.
Do you have any idea whether removing whitespace from the generated HTML might help?
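On the whitespace question: the template text between tags (every indented line inside a logic:iterate body) is written into the response, so stripping it shrinks the page, but it only saves transfer time, not server-side rendering time. A rough sketch of the idea, with a hypothetical helper (crude regexes; not safe for pages containing &lt;pre&gt; or &lt;textarea&gt; content):

```java
public class WhitespaceStripper {

    // Collapse whitespace runs between tags to nothing, and any remaining
    // whitespace runs to a single space. Hypothetical helper for illustration.
    static String strip(String html) {
        return html.replaceAll(">\\s+<", "><").replaceAll("\\s{2,}", " ");
    }

    public static void main(String[] args) {
        String html = "<table>\n  <tr>\n    <td>value</td>\n  </tr>\n</table>";
        System.out.println(strip(html));
        // -> <table><tr><td>value</td></tr></table>
    }
}
```

On a 2 MB page of deeply nested, indented table rows this can remove a large fraction of the bytes, but the Solaris-only render slowdown would remain.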
manjit sharma
Greenhorn

Joined: Nov 02, 2010
Posts: 4
Hi,

I am now trying to compress the JSP output when transferring it to the client, by using a filter.
The response is successfully compressed from 2 MB to 38 KB when JRun runs on the Windows box, but when we run JRun on Solaris 10 the response doesn't get compressed.
After putting some debugging statements in the filter code, I can see that on the Solaris box, when doFilter() is called, the container never internally makes a call to the getWriter() method.

I am using the following filter:

GZIPFilter.java code is as follows:


package com.ssdt.ain.web;

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GZIPFilter implements Filter {

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        if (req instanceof HttpServletRequest) {
            HttpServletRequest request = (HttpServletRequest) req;
            HttpServletResponse response = (HttpServletResponse) res;

            if (isGZIPSupported(request)) {
                GZIPResponseWrapper wrappedResponse = new GZIPResponseWrapper(response);
                chain.doFilter(req, wrappedResponse);
                wrappedResponse.finishResponse();
                return;
            }
        }
        // fall through for non-HTTP requests or clients without gzip support
        // (the original version skipped the chain entirely for non-HTTP requests)
        chain.doFilter(req, res);
    }

    /**
     * Convenience method to test for GZIP capabilities
     *
     * @param req the current user request
     * @return boolean indicating GZIP support
     */
    private boolean isGZIPSupported(HttpServletRequest req) {
        String browserEncodings = req.getHeader("accept-encoding");
        return (browserEncodings != null) && (browserEncodings.indexOf("gzip") != -1);
    }

    public void init(FilterConfig filterConfig) {
        // no action here
    }

    public void destroy() {
        // no action here
    }
}

GZIPResponseWrapper.java code is as follows:



package com.ssdt.ain.web;

import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;

import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

public class GZIPResponseWrapper extends HttpServletResponseWrapper {
    protected HttpServletResponse origResponse = null;
    protected ServletOutputStream stream = null;
    protected PrintWriter writer = null;
    protected int error = 0;

    public GZIPResponseWrapper(HttpServletResponse response) {
        super(response);
        origResponse = response;
    }

    public ServletOutputStream createOutputStream() throws IOException {
        return new GZIPResponseStream(origResponse);
    }

    public void finishResponse() {
        try {
            if (writer != null) {
                writer.close();
            } else if (stream != null) {
                stream.close();
            }
        } catch (IOException e) {
            System.out.println("Exception closing response: " + e.getMessage());
        }
    }

    public void flushBuffer() throws IOException {
        System.out.println("inside flushBuffer()");
        // guard: flushBuffer() may be called before any output accessor
        if (stream != null) {
            stream.flush();
        }
    }

    public ServletOutputStream getOutputStream() throws IOException {
        System.out.println("inside getOutputStream()");
        if (writer != null) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP:getWriter() has already been called!");
            throw new IllegalStateException("getWriter() has already been called!");
        }

        if (stream == null) {
            stream = createOutputStream();
        }

        return stream;
    }

    public PrintWriter getWriter() throws IOException {
        System.out.println("inside getWriter() writer is " + writer);
        // If access was denied, don't create a new stream or writer because
        // it causes the web.xml's 403 page to not render
        if (this.error == HttpServletResponse.SC_FORBIDDEN) {
            return super.getWriter();
        }

        if (writer != null) {
            return writer;
        }

        if (stream != null) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP:getOutputStream() has already been called!");
            throw new IllegalStateException("getOutputStream() has already been called!");
        }

        stream = createOutputStream();
        writer = new PrintWriter(new OutputStreamWriter(stream, origResponse.getCharacterEncoding()));

        return writer;
    }

    public void setContentLength(int length) {
        // no action here: the compressed length is set when the stream closes
    }

    /**
     * @see javax.servlet.http.HttpServletResponse#sendError(int,
     * java.lang.String)
     */
    public void sendError(int err, String message) throws IOException {
        super.sendError(err, message);
        this.error = err;
    }
}



GZIPResponseStream.java code is as follows:


package com.ssdt.ain.web;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;

public class GZIPResponseStream extends ServletOutputStream {
    // abstraction of the output stream used for compression
    protected OutputStream bufferedOutput = null;

    // state-keeping variable for whether close() has been called
    protected boolean closed = false;

    // reference to the original response
    protected HttpServletResponse response = null;

    // reference to the output stream to the client's browser
    protected ServletOutputStream output = null;

    // default size of the in-memory buffer
    private int bufferSize = 2000000;

    public GZIPResponseStream(HttpServletResponse response) throws IOException {
        super();
        closed = false;
        this.response = response;
        this.output = response.getOutputStream();
        bufferedOutput = new ByteArrayOutputStream();
    }

    public void close() throws IOException {
        // verify the stream has not already been closed
        if (closed) {
            throw new IOException("This output stream has already been closed");
        }

        // if we buffered everything in memory, gzip it now
        if (bufferedOutput instanceof ByteArrayOutputStream) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP: ByteArrayOutputStream");
            // get the content
            ByteArrayOutputStream baos = (ByteArrayOutputStream) bufferedOutput;

            // prepare a gzip stream
            ByteArrayOutputStream compressedContent = new ByteArrayOutputStream();
            GZIPOutputStream gzipstream = new GZIPOutputStream(compressedContent);
            byte[] bytes = baos.toByteArray();
            gzipstream.write(bytes);
            gzipstream.finish();

            // get the compressed content
            byte[] compressedBytes = compressedContent.toByteArray();

            // set appropriate HTTP headers
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP: Adding to response");
            response.setContentLength(compressedBytes.length);
            response.addHeader("Content-Encoding", "gzip");
            output.write(compressedBytes);
            output.flush();
            output.close();
            closed = true;
        }
        // if things were not buffered in memory, finish the GZIP stream and
        // the response
        else if (bufferedOutput instanceof GZIPOutputStream) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP: GZIPOutputStream");
            // cast to the appropriate type
            GZIPOutputStream gzipstream = (GZIPOutputStream) bufferedOutput;

            // finish the compression
            gzipstream.finish();

            // finish the response
            output.flush();
            output.close();
            closed = true;
        }
    }

    public void flush() throws IOException {
        if (closed) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP:Cannot flush a closed output stream");
            throw new IOException("Cannot flush a closed output stream");
        }

        bufferedOutput.flush();
    }

    public void write(int b) throws IOException {
        if (closed) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP:Cannot write to a closed output stream");
            throw new IOException("Cannot write to a closed output stream");
        }

        // make sure we aren't over the buffer's limit
        checkBufferSize(1);

        // write the byte to the temporary output
        bufferedOutput.write((byte) b);
    }

    private void checkBufferSize(int length) throws IOException {
        // check whether we are buffering too large a response
        if (bufferedOutput instanceof ByteArrayOutputStream) {
            ByteArrayOutputStream baos = (ByteArrayOutputStream) bufferedOutput;

            if ((baos.size() + length) > bufferSize) {
                // responses too large to keep in memory are sent to the client
                // without Content-Length specified
                // (the original version logged "Size too big" in the wrong branch)
                WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP: Size too big, switching to streaming");
                response.addHeader("Content-Encoding", "gzip");

                // get the bytes buffered so far
                byte[] bytes = baos.toByteArray();

                // make a new gzip stream wrapping the response output stream
                GZIPOutputStream gzipstream = new GZIPOutputStream(output);
                gzipstream.write(bytes);

                // we are no longer buffering; send content via gzipstream
                bufferedOutput = gzipstream;
            }
        }
    }

    public void write(byte[] b) throws IOException {
        write(b, 0, b.length);
    }

    public void write(byte[] b, int off, int len) throws IOException {
        if (closed) {
            WebLogger.logOutputMessage(WebLogger.DATABASE, 1, "GZIP:Cannot write to a closed output stream");
            throw new IOException("Cannot write to a closed output stream");
        }

        // make sure we aren't over the buffer's limit
        checkBufferSize(len);

        // write the content to the buffer
        bufferedOutput.write(b, off, len);
    }

    public boolean closed() {
        return this.closed;
    }

    public void reset() {
        // noop
    }
}


In web.xml I have the following:

<filter>
    <filter-name>Compress</filter-name>
    <filter-class>com.ssdt.ain.web.GZIPFilter</filter-class>
</filter>

<filter-mapping>
    <filter-name>Compress</filter-name>
    <url-pattern>*.do</url-pattern>
</filter-mapping>
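Independent of the container, the buffer-then-compress logic used by GZIPResponseStream can be exercised in isolation to confirm the compression itself is sound. A minimal round-trip sketch (standalone class with invented names, mirroring the ByteArrayOutputStream-to-GZIPOutputStream path):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {

    // Compress a byte array the way GZIPResponseStream's close() does:
    // write the buffered bytes through a gzip stream into a second buffer.
    static byte[] gzip(byte[] plain) throws IOException {
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        GZIPOutputStream gzipstream = new GZIPOutputStream(compressed);
        gzipstream.write(plain);
        gzipstream.finish();
        gzipstream.close();
        return compressed.toByteArray();
    }

    // Decompress, as the browser would after seeing Content-Encoding: gzip.
    static byte[] gunzip(byte[] compressed) throws IOException {
        GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
        ByteArrayOutputStream plain = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            plain.write(buf, 0, n);
        }
        return plain.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Highly repetitive markup compresses very well, which is why a
        // 2 MB table of 4000 similar rows can shrink to tens of kilobytes.
        StringBuilder html = new StringBuilder();
        for (int i = 0; i < 4000; i++) {
            html.append("<tr><td>row ").append(i).append("</td></tr>\n");
        }
        byte[] plain = html.toString().getBytes("UTF-8");
        byte[] compressed = gzip(plain);
        byte[] restored = gunzip(compressed);
        if (!new String(restored, "UTF-8").equals(html.toString())) {
            throw new AssertionError("round trip failed");
        }
        System.out.println("plain=" + plain.length + " compressed=" + compressed.length);
    }
}
```

If this works on both boxes (it should, since java.util.zip is platform-independent), the Solaris difference lies in how JRun's JSP runtime writes the response, not in the compression code.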


It would be great if someone could help me.

Thanks,
Manjit
 