Skanda Raman
Ranch Hand
since Mar 21, 2008

Recent posts by Skanda Raman

I got the fix. I was catching the exception in the catch block, which I now understand I should not do. If I rethrow the exception from the catch block, the message goes to the DLQ after the configured retries.

Please advise how the requirements below could be accomplished with ActiveMQ and Spring JMS.

A) I want to push messages to the DLQ only when a specific exception occurs. Currently, as I understand it, any exception thrown from a message listener causes the message to be moved to the DLQ after the configured retry attempts.
B) Can I configure no-retry for custom exceptions? For example, if an instance of my MyException class is thrown, those messages should not be retried and should not even go to the DLQ.
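If it helps, here is a minimal sketch of a client-side redelivery policy wired into the ActiveMQ connection factory (bean names, values, and the broker URL are illustrative). For requirement B, one common approach is to catch MyException inside the listener and return normally, so the message is acknowledged and never redelivered or sent to the DLQ:

```xml
<!-- Sketch: cap redeliveries before a message is routed to the DLQ -->
<bean id="redeliveryPolicy" class="org.apache.activemq.RedeliveryPolicy">
    <property name="maximumRedeliveries" value="3"/>
    <property name="initialRedeliveryDelay" value="1000"/>
</bean>

<bean id="connectionFactory" class="org.apache.activemq.ActiveMQConnectionFactory">
    <property name="brokerURL" value="tcp://localhost:61616"/>
    <property name="redeliveryPolicy" ref="redeliveryPolicy"/>
</bean>
```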



9 years ago
I tried the same approach, but my poison messages are not reaching the DLQ. Here are my configuration XML and my listener/consumer. Can you please review them and advise if there are any errors or mistakes in my code?






Also, kindly provide answers to these questions:

A) I am using ActiveMQ 5.5, and the redelivery policy is not accepting the queue property (see the **** comment in the XML file). So I tried creating the DLQ manually in the admin console and re-running the listener, but the poison messages are not getting queued there. Please advise.

B) Do I need to create the DLQ explicitly from code, or does ActiveMQ create it once a message has exhausted its retries?
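For what it's worth, per-destination DLQ naming is normally a broker-side setting rather than a client RedeliveryPolicy property, which may explain why the queue property is rejected. A minimal sketch of the relevant activemq.xml fragment (the DLQ. prefix is just an example) might look like:

```xml
<!-- Inside the <broker> element: give every queue its own DLQ.<name> queue -->
<destinationPolicy>
    <policyMap>
        <policyEntries>
            <policyEntry queue=">">
                <deadLetterStrategy>
                    <individualDeadLetterStrategy queuePrefix="DLQ."
                                                  useQueueForQueueMessages="true"/>
                </deadLetterStrategy>
            </policyEntry>
        </policyEntries>
    </policyMap>
</destinationPolicy>
```

With this, the broker creates each DLQ automatically when the first poison message arrives; nothing needs to be created in code.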
9 years ago
I have a requirement to consume messages from two queues, and I am using ActiveMQ. I have to implement a retry mechanism in case of an error, a network failure, or an application-server failure, putting the message back on the same queue. I also want to route any poison messages to a DLQ.

Please let me know if I can achieve this through Spring JMS, and please suggest some good examples for accomplishing this task. I checked the Spring JMS documentation, but it does not have much detail on this.
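As a starting point, here is a minimal sketch of a Spring JMS listener container with transacted sessions, so that a listener that throws an exception triggers broker redelivery and exhausted messages land on the DLQ (the destination and bean names are illustrative):

```xml
<bean id="listenerContainer"
      class="org.springframework.jms.listener.DefaultMessageListenerContainer">
    <property name="connectionFactory" ref="connectionFactory"/>
    <property name="destinationName" value="ORDERS.QUEUE"/>
    <property name="messageListener" ref="orderListener"/>
    <!-- transacted sessions: a thrown exception rolls back and redelivers -->
    <property name="sessionTransacted" value="true"/>
</bean>
```

A second container bean pointed at the other queue would cover the two-queue requirement.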
9 years ago
Please let me know if you have any suggestions on the direction I have chosen for resolving this problem.
11 years ago
Sorry for not posting the complete details.

The issue is essentially the one described in the IBM technote below:

http://www-01.ibm.com/support/docview.wss?uid=swg21409335

I wanted to pass a list of objects to my stored procedure through Spring, so I implemented ArrayDescriptor from the Oracle package. Now, when running my code on the WAS server, it throws the error below. I have set up the data source in the WAS admin console, and it appears WAS wraps the connection object as WSJdbcConnection, which is not compatible with the OracleConnection that ArrayDescriptor requires.



I tried a plain typecast from WSJdbcConnection to OracleConnection, and it does not work. My requirement is to run the code on both the Tomcat and WAS servers.

That said, after some research on Google I found that the Spring Framework provides native JDBC extractor classes, so I used CommonsDbcpNativeJdbcExtractor for Tomcat and WebSphereNativeJdbcExtractor for WAS, as below.



Please let me know if this is the right approach or whether I am going in the wrong direction.
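For reference, the per-environment wiring can be sketched roughly as below; bean ids are illustrative, and only the extractor class changes between environments:

```xml
<!-- On WAS: unwrap WSJdbcConnection to the underlying OracleConnection -->
<bean id="nativeJdbcExtractor"
      class="org.springframework.jdbc.support.nativejdbc.WebSphereNativeJdbcExtractor"/>

<!-- On Tomcat with Commons DBCP, use instead:
     org.springframework.jdbc.support.nativejdbc.CommonsDbcpNativeJdbcExtractor -->
```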
11 years ago
Hi,

I am calling a stored procedure using the Spring Framework. My application server in the production environment is WebSphere; the data source is configured there, and the connection is retrieved through JNDI. My code requires the Connection object to commit the transactions. I am getting the error below when executing the application in WAS.



Please let me know how to avoid this error. My application runs on Tomcat on my local machine and on WAS on my production machine.
11 years ago
Here is the insert script I am trying to execute through JdbcBatchItemWriter. I want to use the same sequence ID in TABLE_B.

INSERT INTO TEST_A (ID, NAME) VALUES (TEST_A.NEXTVAL, :name)

Hope this info helps. Let me know if there is a way in Spring / Spring Batch to do this.
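In plain Oracle SQL, one way to reuse the generated key (assuming TEST_A here is the sequence name, and with illustrative column names for the child table) is NEXTVAL/CURRVAL within the same session:

```sql
-- NEXTVAL generates the parent key; CURRVAL reuses it for the child row
INSERT INTO TEST_A  (ID, NAME)        VALUES (TEST_A.NEXTVAL, :name);
INSERT INTO TABLE_B (A_ID, PLAN_NAME) VALUES (TEST_A.CURRVAL, :planName);
```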

11 years ago

Additionally, I wanted to mention that the PK for TABLE_A is a sequence value that is generated automatically. Hence, even when I try to use the CompositeItemWriter, I cannot reuse the object for the next table's writers.

However, I tried inserting the data into TABLE_A first using SimpleJdbcInsert, getting the PK value, and then using JdbcBatchItemWriter to write the other tables.

This approach seems more time-consuming and hurts performance when there are huge numbers of records.

I would appreciate any thoughts on this.
11 years ago
Hello,

I have a requirement to insert data into two tables using JdbcBatchItemWriter.

I have two tables TABLE_A, TABLE_B.

TABLE_B is a child of TABLE_A with a foreign-key relation. I can successfully insert data into TABLE_A using this batch writer.

However, due to the FK constraint, I do not know the FK value to insert into TABLE_B.

Please let me know if there is a way to get the primary-key values for the records inserted into TABLE_A during the process, so that I can build an object and item-write it into TABLE_B.
11 years ago
I am reading CSV files using a flat-file reader. I want to read the list of files from a folder, pass the file names to the job for execution, and put them in the mapper files.
11 years ago
I have a requirement to process flat files through a Spring Batch job execution. I get the list of files that need to be processed from a folder. Please let me know how to accomplish this.

For example: I have player.csv, player1.csv, and player2.csv in a folder, and I have these file names in a list. Please let me know if I can pass all these file names as JobParameters so that I can execute a single job.
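One alternative worth considering (a sketch, with illustrative bean ids and path) is Spring Batch's MultiResourceItemReader, which lets a single step iterate over every matching file without passing each name as a job parameter:

```xml
<bean id="multiResourceReader"
      class="org.springframework.batch.item.file.MultiResourceItemReader">
    <!-- picks up player.csv, player1.csv, player2.csv, ... -->
    <property name="resources" value="file:input/player*.csv"/>
    <!-- delegate is a normal FlatFileItemReader configured for one file -->
    <property name="delegate" ref="flatFileItemReader"/>
</bean>
```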

11 years ago
I have parsed the huge XML file using StAX and loaded the information into my POJOs. I used these POJOs and committed the data to the database.

Now I have a requirement to parse the huge XML file in chunks; meaning, parse partially up to a threshold limit, commit to the database, and then parse what is left. Can we do this using the StAX API?

For example, I have 50K entries and want to set a parse threshold of 500. In this case, can I parse 500 entries, store them in the database, and then continue parsing the remainder?

Please advise.
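Since StAX is a pull parser, this kind of chunking is possible in principle: the reader only advances when asked, so you can flush after every N records and resume parsing. A self-contained sketch, where the `<entry>` element name and the in-memory "commit" list are illustrative stand-ins for the real POJOs and database call:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

// Pull-parse <entry> elements with StAX and flush every chunkSize records.
public class ChunkedStaxParser {

    // Returns the total number of entries parsed; each flushed chunk is
    // appended to committedChunks (standing in for a database commit).
    public static int parse(String xml, int chunkSize,
                            List<List<String>> committedChunks)
            throws XMLStreamException {
        XMLStreamReader r = XMLInputFactory.newFactory()
                .createXMLStreamReader(new StringReader(xml));
        List<String> chunk = new ArrayList<>();
        int total = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "entry".equals(r.getLocalName())) {
                chunk.add(r.getElementText());      // one record parsed
                total++;
                if (chunk.size() == chunkSize) {    // threshold reached:
                    committedChunks.add(new ArrayList<>(chunk)); // "commit"
                    chunk.clear();                  // then keep parsing
                }
            }
        }
        if (!chunk.isEmpty()) {                     // flush the final partial chunk
            committedChunks.add(new ArrayList<>(chunk));
        }
        return total;
    }
}
```

With a threshold of 500 and 50K entries this would perform 100 commits, never holding more than 500 records in memory at once.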
The additional information below should help.

Here, getInsPlanList has size 10 whereas getInsuranceList().size() is 2; I want to iterate over and insert the record details for all getInsPlanList entries.

So, in this case, do I need to put this logic in a loop over getInsuranceList().size() so that my getBatchSize() can return getInsPlanList.size()?




11 years ago

Hello - I have a requirement to insert around 2,000 records, so I chose to execute the inserts through the JdbcTemplate batch update.

Problem: This inserts insurance plans for my customer. Per the requirement, I have many insurances, and each insurance has many plans. So getBatchSize() returns the number of insurances available, and while setting the values I get an insurance-plan object and insert the data. However, with the line getInsuranceList().get(i).getInsPlan(), I get only one plan.

Please advise whether I have to put this method in a loop over the available insurances so that I can return the size of the plans available. Is looping the method the correct approach?
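One common way around this is to flatten the nested insurance-to-plans structure into a single row list before calling batchUpdate, so that getBatchSize() can simply return the total number of plan rows. A self-contained sketch; the class and method names are illustrative, and plans are reduced to strings for brevity:

```java
import java.util.ArrayList;
import java.util.List;

// Flatten insurance -> plans into one list, one entry per plan row.
public class PlanFlattener {

    public static List<String> flattenPlans(List<List<String>> insurances) {
        List<String> rows = new ArrayList<>();
        for (List<String> plans : insurances) {
            rows.addAll(plans);   // one batch row per plan, not per insurance
        }
        return rows;              // rows.size() is the real batch size
    }
}
```

The batch callback then indexes into the flat list: getBatchSize() returns rows.size(), and setValues(ps, i) binds rows.get(i), with no nested looping inside the callback.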

11 years ago
Thank you, Jeanne. You provided a detailed explanation.

I am now struggling with this session-invalidation problem. As you mentioned, please let me know if you are suggesting setting the path on the JSESSIONID cookie. I am trying the approach below; please advise if it is the right one.

1. Get the list of cookies from the request using the code below.


2. Loop over the cookie array, find JSESSIONID, set its path to my application's context path, and then add it to the response header.

Or can I create a new cookie from my application and add the path to it?

Please advise.
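Step 2 above could be sketched as below. For illustration it uses the JDK's java.net.HttpCookie so it runs standalone; in the web app the same loop would use javax.servlet.http.Cookie from request.getCookies() and finish with response.addCookie(cookie). The class name and context path are illustrative:

```java
import java.net.HttpCookie;
import java.util.List;

// Find JSESSIONID among the request cookies and scope it to the app's
// context path; the caller adds the returned cookie back to the response.
public class SessionCookieFixer {

    public static HttpCookie scopeToContext(List<HttpCookie> cookies,
                                            String contextPath) {
        for (HttpCookie c : cookies) {
            if ("JSESSIONID".equals(c.getName())) {
                c.setPath(contextPath);  // restrict the session cookie to this app
                return c;
            }
        }
        return null;                     // no session cookie present
    }
}
```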
11 years ago