nitinram agarwal

Ranch Hand
since Jan 29, 2009

Recent posts by nitinram agarwal

Hello,
I have a REST service built with Spring Boot that exposes POST endpoints. The POST request sets up some records and comfortably handles up to 100 records or so. However, one of the end users needs to process more than 500 records in a single POST request, and the solution we proposed is to break the payload into n chunks (for example, 5 chunks of 100 records) and make one service call per chunk. However, this approach has issues with transaction management, since each service call is its own transaction. I am checking with the end users whether they can accept partial failure, as managing a single transaction across all the records seems complex. I am still exploring the details and wanted to ask whether there are some good pointers for this scenario?
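As a minimal sketch of the chunking step (assuming the records arrive as a plain java.util.List; the class and method names here are hypothetical, not from the actual service):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedSubmit {

    // Split a list into consecutive chunks of at most chunkSize elements.
    // Each chunk would then be the body of one POST call.
    static <T> List<List<T>> partition(List<T> records, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < records.size(); i += chunkSize) {
            chunks.add(records.subList(i, Math.min(i + chunkSize, records.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 520; i++) records.add(i);

        // 520 records in chunks of 100 -> 6 calls, the last with 20 records
        List<List<Integer>> chunks = partition(records, 100);
        System.out.println(chunks.size());        // 6
        System.out.println(chunks.get(5).size()); // 20
    }
}
```

If partial failure is acceptable, the caller can record which chunk failed and retry only that chunk, which is usually much simpler than coordinating one transaction across all the calls.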
Regards
1 year ago
I already have the repository class implemented to support CRUD operations. I am having an issue with the "pattern search" on a specific field, as posted in the question. I had a look at the link provided, but I am not sure I see anything specific to the issue I am facing. Can you please point out what I am missing?
2 years ago
I want to get a list of all persons who have some keywords in their hobbies (for example hockey - case insensitive).

The Java class corresponding to my Mongo collection is


Before saving this document in the Mongo collection, I am converting each hobby to lowercase.

Per https://docs.mongodb.com/manual/reference/operator/query/in/, such queries are supported in the Mongo shell, and I am able to get the data using the command below

However, using Spring Data MongoDB (3.2.4), I am not getting the data with the code snippet below


I would like to know if there is a way to get the data using the Spring library.
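I can't see the elided snippets, but one detail worth checking: the shell query works because $in accepts regular-expression literals, and with Spring Data MongoDB the analogous move is to pass java.util.regex.Pattern objects (e.g. to Criteria.where("hobbies").in(...)). A hedged, self-contained sketch of the matching semantics in plain Java (the Person shape is assumed from the post; it is not the actual class):

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class HobbySearch {

    record Person(String name, List<String> hobbies) {}

    // Case-insensitive keyword match against each person's hobby list.
    // The same Pattern instance is the kind of object the MongoDB driver
    // serializes as a regex inside $in.
    static List<Person> withHobby(List<Person> people, String keyword) {
        Pattern p = Pattern.compile(Pattern.quote(keyword), Pattern.CASE_INSENSITIVE);
        return people.stream()
                .filter(person -> person.hobbies().stream()
                        .anyMatch(h -> p.matcher(h).find()))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Person> people = Arrays.asList(
                new Person("a", Arrays.asList("ice hockey", "chess")),
                new Person("b", Arrays.asList("reading")));
        System.out.println(withHobby(people, "Hockey").size()); // 1
    }
}
```

Note that if the query passes a plain String to $in, it must match the stored element exactly, which is why "hockey" would miss "ice hockey"; the Pattern form does a substring regex match instead.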
2 years ago
I have a requirement to highlight rows with a specific color based on one of their attributes. For example, say I am displaying employee records: every employee whose highest-education column is not null should be shown in green.

I have the following existing code in the corresponding employee.component.html file



So as of now, only a specific column is colored, but the requirement is to color the whole row. I am using Angular 10.x. I am trying to apply some kind of logic on the opening "tr" tag, but it's not working.
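For coloring the whole row rather than one cell, the class binding can go on the tr itself inside the *ngFor. A sketch, assuming the row variable is emp and the field is highestEducation (both names are guesses, since the original template is not shown):

```html
<!-- employee.component.html: the class applies to the entire row -->
<tr *ngFor="let emp of employees"
    [ngClass]="{'highlight-green': emp.highestEducation != null}">
  <td>{{ emp.name }}</td>
  <td>{{ emp.highestEducation }}</td>
</tr>
```

with a matching rule such as `.highlight-green { background-color: #c8e6c9; }` in the component stylesheet. A single-class binding like `[class.highlight-green]="emp.highestEducation != null"` works the same way.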

I am trying to find the common elements between two lists of objects.

For example


similarly


I need to find the entries from employees which have the same first name as some student.

I did the following




This works; however, I am trying to find a solution where the common attributes are not limited to a single field (for example, both first and last name).

The lists can contain many elements, but irrespective of their size, I am wondering whether a better solution exists using FP.
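One stream-based option for matching on more than one field is to build a set of composite keys from one list and filter the other against it. A sketch under the assumption that Employee and Student each expose first and last names (the records below are stand-ins for the real classes):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class CommonElements {

    record Employee(String firstName, String lastName) {}
    record Student(String firstName, String lastName) {}

    // Employees whose (first, last) name pair also appears among the students.
    // A List<String> works as the composite key because List defines
    // element-wise equals/hashCode.
    static List<Employee> matching(List<Employee> employees, List<Student> students) {
        Set<List<String>> keys = students.stream()
                .map(s -> Arrays.asList(s.firstName(), s.lastName()))
                .collect(Collectors.toSet());
        return employees.stream()
                .filter(e -> keys.contains(Arrays.asList(e.firstName(), e.lastName())))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Employee> emps = List.of(new Employee("John", "Doe"), new Employee("Amy", "Lee"));
        List<Student> studs = List.of(new Student("John", "Doe"));
        System.out.println(matching(emps, studs).size()); // 1
    }
}
```

Unlike a nested anyMatch over both lists (O(n*m)), the set lookup keeps this roughly O(n+m), and extending the key to more fields is just adding elements to the key list.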
2 years ago

Paul Clapham wrote:There is only one alternative: process the data in parts instead of loading it all into memory at once. How you would do that depends strongly on the structure of the data, which you haven't said much about. You would like to have an array but it's too big for memory, so you can't have that array. Instead you have to process the data in parts. Again, what those parts might be depend on what the data is.


nitinram agarwal wrote: I have updated my question and provided some details. Please see if it helps

3 years ago
Hello friends,
Hope you are keeping well at this time of pandemic.

I would like your opinion on two questions I was asked in a technical discussion.
While I can think of some possible options, I thought of putting them here to get more details. If the question is inappropriate for this forum, please let me know and I will delete it.
The questions asked in the technical discussion were (I am not supposed to use any standard Java library which handles a similar situation, and must implement something on my own):
1. How do you process an array which does not fit in the memory available to the JVM?
2. How do you process a big file, say 20 GB, which does not fit in available memory?

For 1, I answered the following:
a. get the length of the array and process it in parts based on that length (for example, 4 iterations over the array, length/4 elements at a time)

For 2, I said that the original file can be split into multiple parts using the split command (or something similar in the respective OS environment). Process the individual smaller files and generate intermediate results (for example, data aggregations). Once the individual files are processed, then, based on the final size of the intermediate result files, either process all the result files in one go or apply the iterative processing again.

I think the approach for 2 made sense, but I am not sure about the approach for 1. If there are better alternatives, I would like to know them; I am generally curious as to what the answer should be.
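The splitting in 2 can also be done without the OS split command, by streaming the file line by line and keeping only the running aggregate in memory. A sketch, where counting word frequencies is just a stand-in for whatever aggregation is actually needed:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

public class StreamAggregate {

    // Reads the file one line at a time, so memory use is bounded by the
    // size of the aggregate map, not by the size of the file.
    static Map<String, Long> wordCounts(Path file) throws IOException {
        Map<String, Long> counts = new HashMap<>();
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                for (String word : line.split("\\s+")) {
                    if (!word.isEmpty()) counts.merge(word, 1L, Long::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("big", ".txt");
        Files.writeString(tmp, "a b a\nb a\n");
        System.out.println(wordCounts(tmp).get("a")); // 3
        Files.delete(tmp);
    }
}
```

The same shape answers question 1 conceptually: process a fixed-size window of the data at a time and retain only the aggregate, never the whole input.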
3 years ago
Thanks for the details. I will try to use a standard product, as I think doing something in house will not be practical in the long term.
3 years ago
Hello
I have been asked to design a system that monitors a log file in real time and reports issues, if any (like application failure, threshold breach for a specific exception, etc.). I know there are standard tools for such functionality, like Filewatcher from AWS, but my firm does not want to invest in a tool and has asked me to develop one in house with some basic features. My languages of choice are Java and shell scripting. Can you please advise on the design approach? The challenges I can think of are the following:
1. Actively monitoring the log file: this means running a process in parallel with the application being monitored and constantly reading its log file. I am not sure of the best way to read a log file that is constantly being written to.
2. Passively monitoring the log file: possibly run a program every 30 seconds that does the following
   2.1. take a snapshot of the log file
   2.2. compare the line count against the previously stored snapshot (taken 30 seconds ago)
   2.3. read the contents of the newly added lines and determine if anything happened; also possibly maintain state for exception counts in secondary storage
I am open to suggestions for a better design, and I can also choose Python for this work if such functionality is easier to implement there.
Please advise.
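For the passive variant, instead of line-count snapshots, one option is to remember the byte offset already read and only read what was appended since the last poll. A minimal sketch (log rotation and truncation handling deliberately omitted):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class LogTail {

    private long position; // byte offset already processed

    // Returns only the bytes appended since the previous call, then
    // advances the remembered offset to the end of the file.
    String readNew(Path logFile) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(logFile.toFile(), "r")) {
            raf.seek(position);
            byte[] buf = new byte[(int) (raf.length() - position)];
            raf.readFully(buf);
            position = raf.length();
            return new String(buf);
        }
    }

    public static void main(String[] args) throws IOException {
        Path log = Files.createTempFile("app", ".log");
        LogTail tail = new LogTail();
        Files.writeString(log, "INFO start\n");
        System.out.println(tail.readNew(log).trim()); // INFO start
        Files.writeString(log, "ERROR boom\n", StandardOpenOption.APPEND);
        System.out.println(tail.readNew(log).trim()); // ERROR boom
        Files.delete(log);
    }
}
```

The returned text can then be scanned for patterns like "ERROR" or a specific exception name, with the running counts persisted between polls. Seeking to a saved offset is much cheaper than re-reading and line-counting the whole file every 30 seconds.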

3 years ago
Hello,
It is a legacy application which does the following:
1. invokes the stored procedure with some passed-in criteria
2. passes the returned data on to another application.

I am not fully aware of the end-to-end flow, and reengineering the application is definitely not an option as of now (due to the overall complexity of the application plus the effort required for reengineering, which needs budget approval, etc.).

For this, I am looking for a solution within the existing application.
Hello Everyone,
Thanks a lot for your suggestions. I went ahead with object caching and saw an improvement of around 30% in the application (the application has an uptime of 6 days, starting on Sunday morning and stopping on Saturday morning).

Regards,
8 years ago
Hello,
I have legacy code which uses JDBC for the database connection and calls a procedure which returns a dataset. Each row consists of around 30 columns, and when the dataset is around 300K rows, my program fails with an out-of-memory error.

After some analysis, I found that the current program cannot handle more than 150K records (it runs out of memory). With some basic performance tuning and code cleanup, I raised this to 160K (meaning the program now fails after fetching 160K rows).

However, I am not able to tune it further. The code is conventional in that it uses a ResultSet object to fetch rows.
I am trying to find a way to nullify a specific ResultSet row once it has been processed, but did not find anything in the API. I am still investigating whether anything else can be done, but I feel that if I could set the current ResultSet position to null after processing it, there should be some further improvement.

Can anyone please suggest whether there is a way to do this? In parallel, I am checking whether there is an alternative using Spring JDBC (but that requires more development effort and some code reengineering, so getting approval for that approach will be time-consuming).
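Individual ResultSet rows cannot be nulled out; the usual levers are to stop the driver from buffering the entire result (e.g. Statement.setFetchSize(...)) and to process each row without retaining it. Since the real code needs a live connection, here is a pure-Java sketch of that streaming shape, with rows modeled as an Iterator standing in for resultSet.next():

```java
import java.util.Iterator;
import java.util.stream.IntStream;

public class StreamingRows {

    // Processes rows one at a time; only the running aggregate is retained,
    // so memory stays flat no matter how many rows arrive. With JDBC the
    // same loop runs over resultSet.next(), ideally after calling
    // statement.setFetchSize(1000) so the driver also buffers in batches.
    static long sum(Iterator<int[]> rows) {
        long total = 0;
        while (rows.hasNext()) {
            int[] row = rows.next(); // the previous row is now garbage-collectable
            total += row[0];
        }
        return total;
    }

    public static void main(String[] args) {
        Iterator<int[]> rows = IntStream.rangeClosed(1, 300_000)
                .mapToObj(i -> new int[]{1}).iterator();
        System.out.println(sum(rows)); // 300000
    }
}
```

The out-of-memory failures at 150K-160K rows suggest the rows are being accumulated somewhere (in a collection, or buffered wholesale by the driver); keeping only the per-row work plus the aggregate is what removes the ceiling.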

Regards
Hello,
Thanks for your reply. Object creation takes around 10-20 milliseconds, but the class design will not allow my approach, since there are some instance variables.
8 years ago