Up to you really. The size of the application doesn't particularly matter - I'd personally use Hibernate for most DAO logic these days, unless the application was very basic or a bulk loader.
It's not the volume of data that should drive your decision, it's portability and ease of use. If you go with JDBC, you'll have to write all the nuts and bolts to map result sets to objects yourself (something like the sketch below). No biggie if there aren't many entities, but if you ever have to move to a different database, Hibernate is a godsend.
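Just to illustrate the "nuts and bolts" point, here's a minimal sketch of what a hand-rolled JDBC DAO tends to look like. The User class, the users table and its columns are hypothetical, and error handling is kept to a bare minimum:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;

// Plain JDBC: every column-to-field mapping is written by hand.
public class JdbcUserDao {

    private final DataSource dataSource;

    public JdbcUserDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public List<User> findAll() throws SQLException {
        List<User> users = new ArrayList<>();
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement("SELECT id, name, email FROM users");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                // The "nuts and bolts": copying each column into a field yourself.
                User u = new User();
                u.setId(rs.getLong("id"));
                u.setName(rs.getString("name"));
                u.setEmail(rs.getString("email"));
                users.add(u);
            }
        }
        return users;
    }
}

Multiply that by every entity and every query in the application and the boilerplate adds up quickly; Hibernate generates that mapping for you.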
Whatever you choose, assuming you use a data access (DAO) pattern, you should be able to swap your persistence technology easily enough.
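For example, the rest of the application only depends on an interface like the one below (names are hypothetical), so a JDBC-backed implementation can be replaced by a Hibernate-backed one without touching the callers:

import java.util.List;

// The DAO interface hides the persistence technology from the rest of the app;
// JdbcUserDao and HibernateUserDao could both implement it.
public interface UserDao {
    User findById(long id);
    List<User> findAll();
    void save(User user);
    void delete(User user);
}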
O/R mapping is well suited to read --> modify --> write centric applications and less suited to write-centric applications (e.g. batch processes with large data sets of 5,000 rows or more) where data is seldom read. Although this was generally true of many earlier O/R mapping frameworks, most today (including Hibernate) provide efficient ways of performing large batch-style write operations.
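For instance, a Hibernate batch insert is usually workable if you flush and clear the session periodically so it doesn't hold every entity in memory. A rough sketch, assuming a hypothetical User entity and hibernate.jdbc.batch_size set to 50 in the configuration:

import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class UserBatchLoader {

    private final SessionFactory sessionFactory;

    public UserBatchLoader(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public void load(List<User> users) {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        int i = 0;
        for (User user : users) {
            session.save(user);
            if (++i % 50 == 0) {   // same value as hibernate.jdbc.batch_size
                session.flush();   // push the batched inserts to the database
                session.clear();   // release the entities from the session cache
            }
        }
        tx.commit();
        session.close();
    }
}

It still won't beat a raw JDBC bulk loader or the database's own import tools for huge one-off loads, but for ordinary batch sizes it's usually good enough.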