Recently we discovered that, under high traffic volume, our Oracle 10g database was running into data fragmentation issues because we store BLOBs that average 140 KB each. The application works like this: whenever a user comes to the site and browses through it, we insert/update their session row with the pertinent info, and after 20 minutes of idle time the BLOB is deleted. As I understand it, each delete removes a BLOB and leaves a hole, and each subsequent insert/update then scans the whole table until it finds an appropriately sized hole, which is causing the slowness. It turned out to be an internal Oracle bug, so we are looking at other alternatives for storing these large BLOBs. I have heard of JCS and other in-memory caching mechanisms; we are evaluating which one to use, or whether to change the whole application to minimize BLOB usage.

My question is: how do you deal with session management for large BLOBs? I am hoping someone can shed light on the best practice.
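To make the workflow concrete, here is a minimal sketch (all names hypothetical, not our actual code) of the in-memory alternative we are considering: keep the ~140 KB session payload in a process-local map with a 20-minute idle TTL, instead of writing it to a BLOB column and deleting it later.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Sketch of an idle-TTL session cache held in memory instead of an
 * Oracle BLOB column. Entries expire 20 minutes after last access
 * and are evicted lazily when next touched.
 */
class SessionCache {
    private static final long IDLE_TTL_MS = 20 * 60 * 1000; // 20-minute idle timeout

    private static class Entry {
        final byte[] payload;      // the ~140 KB session blob
        volatile long lastAccess;  // refreshed on every successful get
        Entry(byte[] payload) {
            this.payload = payload;
            this.lastAccess = System.currentTimeMillis();
        }
    }

    private final Map<String, Entry> sessions = new ConcurrentHashMap<>();

    /** Insert or overwrite a session payload, resetting its idle timer. */
    public void put(String sessionId, byte[] payload) {
        sessions.put(sessionId, new Entry(payload));
    }

    /** Return the payload and refresh the idle timer, or null if absent/expired. */
    public byte[] get(String sessionId) {
        Entry e = sessions.get(sessionId);
        if (e == null) {
            return null;
        }
        long now = System.currentTimeMillis();
        if (now - e.lastAccess > IDLE_TTL_MS) {
            sessions.remove(sessionId, e); // expired: evict lazily on access
            return null;
        }
        e.lastAccess = now;
        return e.payload;
    }

    /** Number of entries currently held (may include not-yet-evicted expired ones). */
    public int size() {
        return sessions.size();
    }
}
```

A production version would need a background sweeper (or a library like JCS/Ehcache with its own eviction) so that abandoned sessions do not linger until the next access, but it avoids the delete-then-reuse pattern that is fragmenting the table.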