Pablo Abbate

Ranch Hand
since Aug 06, 2012
Pablo likes: Spring, Java, Ubuntu

Recent posts by Pablo Abbate

Tony Docherty wrote:

kri shan wrote:Which give better performance for counter

What do you mean by better performance, is it accuracy, speed or something else altogether?

Pablo Abbate wrote:If multiple threads update the counter, use AtomicInteger; if not, the ordinary increment is fine.

I think the decision is a bit more complex than that.
It may be that the reads and writes are currently done from within synchronized blocks for other reasons we are not aware of. Or it may be that the hit counter is only indicative, so (provided writes are done from within a synchronized block) occasionally being off by a few hits on reads may not be an issue.

I mean if the counter is being (or can be) updated by multiple threads.
If the counter is only ever touched inside a synchronized block, you don't have multiple threads accessing it concurrently.
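A minimal sketch of the synchronized approach described above (the class name is illustrative):

```java
public class SynchronizedCounter {
    private int hits = 0;

    // All reads and writes go through synchronized methods, so only one
    // thread at a time can touch the counter; a plain ++ is safe here.
    public synchronized void increment() { hits++; }

    public synchronized int get() { return hits; }
}
```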
11 years ago

kri shan wrote:Which give better performance for counter(how many hits for my site), Atomic Integer or increment (++)

If multiple threads update the counter, use AtomicInteger; if not, the ordinary increment is fine.
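A small sketch contrasting the two (the class name and iteration counts are illustrative):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class HitCounters {
    // Plain increment: fine when only one thread ever touches the counter.
    static int plainHits = 0;

    // AtomicInteger: lock-free and safe when several threads share the counter.
    static final AtomicInteger atomicHits = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Runnable hit = () -> {
            for (int i = 0; i < 10_000; i++) {
                plainHits++;                  // not atomic: updates can be lost
                atomicHits.incrementAndGet(); // atomic: no update is ever lost
            }
        };
        Thread t1 = new Thread(hit);
        Thread t2 = new Thread(hit);
        t1.start(); t2.start();
        t1.join(); t2.join();

        // atomicHits is always 20000; plainHits is often less under contention
        System.out.println("plain=" + plainHits + " atomic=" + atomicHits.get());
    }
}
```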
11 years ago
Although this is for Linux, it could be useful:

Cloudera Hadoop
11 years ago

Nikhil Das Nomula wrote:Thank you Pablo. That helps. Another question which I have is

I am sure that we can call a map-reduce job from a normal java application. Now the map-reduce jobs in my case has to deal with files on hdfs and also files on other filesystem. Is it possible in hadoop that we can access files from other file system while simultaneously using the files on hdfs. Is that possible ?

So basically my intention is that I have one large file which I want to put it in HDFS for parallel computing and then compare the blocks of this file with some other files(which I do not want to put in HDFS because they need to be accessed as full length file at once.

The Hadoop architecture is oriented toward having several nodes working in parallel, and HDFS automatically replicates data when necessary, handles concurrent access, and so on.
So, if you want to access your "external" file from Hadoop, you should model the access as if it were a database: a single entry point, managed concurrent access, etc.
11 years ago
The MapReduce approach requires the use of keys and values.
If you want to process tasks in parallel, you must be able to split the work somehow, so you need to define a key and a value.

For example, each row of your CSV could have fields like:
id, user, type, ...

So you could use the id as the key and the entire row as the value.
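The key/value split can be sketched in plain Java; this only illustrates the idea (it is not the actual Hadoop Mapper API, and the field layout is assumed):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

public class CsvKeyValue {
    // Emit (id, wholeRow): the first CSV field becomes the key and the
    // untouched row becomes the value, mirroring what a mapper would output.
    static Map.Entry<String, String> toKeyValue(String row) {
        String id = row.split(",", 2)[0];
        return new SimpleEntry<>(id, row);
    }

    public static void main(String[] args) {
        Map.Entry<String, String> kv = toKeyValue("42,alice,admin");
        System.out.println(kv.getKey() + " -> " + kv.getValue());
        // prints "42 -> 42,alice,admin"
    }
}
```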

Hope it helps you,
11 years ago
This kind of error happens when you have an api JAR and an impl JAR with mismatched versions: for example, an api JAR from one release paired with an impl JAR from another.

Please check that.
11 years ago
Are you using the Spring Framework? If not, use a ResourceBundle and put the properties file into the WEB-INF/classes folder.
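A minimal ResourceBundle sketch; the bundle is built inline here only so the example is self-contained, whereas in a web app you would put messages.properties under WEB-INF/classes and call ResourceBundle.getBundle("messages") (the key names are made up):

```java
import java.io.StringReader;
import java.util.PropertyResourceBundle;
import java.util.ResourceBundle;

public class BundleDemo {
    public static void main(String[] args) throws Exception {
        // Inline stand-in for a messages.properties file on the classpath
        ResourceBundle bundle = new PropertyResourceBundle(
                new StringReader("greeting=Hello\nfarewell=Goodbye"));
        System.out.println(bundle.getString("greeting"));  // prints "Hello"
    }
}
```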

11 years ago
It has been a long time since I worked with the UI, but I remember a few things:

If you want to hold the state of the checkboxes when you paginate, you need to submit the values (via the form).
Another way to do it is an Ajax call when you select an individual checkbox.
If you deselect a checkbox, its value doesn't go to the server, so I remember I needed to add a hidden field with value=false.

Hope it helps.
11 years ago
Some of the questions I have asked are:

What is IoC? What does it allow you to do?
Why would you use a prototype bean?
What are the differences between @Autowired and @Resource?
Explain the factory method pattern.
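The IoC question can be illustrated without the framework; a hand-rolled sketch of constructor injection, where all class names are made up and the "wiring" Spring would do from configuration is done by hand:

```java
public class IocSketch {
    interface Greeter { String greet(String name); }

    static class EnglishGreeter implements Greeter {
        public String greet(String name) { return "Hello, " + name; }
    }

    // The service does not create its dependency; whoever assembles the
    // application hands it in. That inversion is the core of IoC.
    static class GreetingService {
        private final Greeter greeter;
        GreetingService(Greeter greeter) { this.greeter = greeter; }
        String welcome(String name) { return greeter.greet(name) + "!"; }
    }

    public static void main(String[] args) {
        // Manual wiring; a Spring container would do this from bean definitions.
        GreetingService service = new GreetingService(new EnglishGreeter());
        System.out.println(service.welcome("Pablo"));  // prints "Hello, Pablo!"
    }
}
```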

11 years ago
I have worked with AccuRev and faced a lot of issues. The workflow idea is great, but there were some problems with syncing. Why not Git? You can build a workflow with it too.

Paul Clapham wrote:

Pablo Abbate wrote:We are not discussing best practices; he wants to know what he can do with JSP ...

But this is not some game where the task is to identify as many different ways to use a JSP as possible. The poster wants to learn about JSP, presumably so that he or she can use it to write web applications. Given that, the responsible thing is to identify best practices before the poster starts learning the less-than-best practices which are so lamentably common.

I disagree with you. We're not discussing your ironic "game". He doesn't know anything about JSP, so a very simple piece of advice is to think of it as a mixture of Java, JSP tags, and HTML tags. I don't know if you have teaching experience, but you should start from what people already know in order to build new concepts.

For me, the best approach is to start by explaining what you CAN do, and then what is best.
11 years ago

Bear Bibeault wrote:

Pablo Abbate wrote:You can think of JSP files as a mixture of JAVA + HTML + JSP TAGS.

Not quite. Putting Java in a JSP has been obsolete and discredited for 10 years. No Java code in JSPs!

We are not discussing best practices; he wants to know what he can do with JSP ...
11 years ago