First, I agree with the other posters that you want to use the right number of tables to correctly represent the characteristics of your data set. Proper normalization will dictate the number of tables based on your business model (if it is a business), your data's real-world behavior (if you are tracking something in nature), or whatever domain you are modeling.
However, there is a different answer to consider. Access imposes an absolute limit of 255 fields per table and roughly 2,000 total bytes per record (4,000 characters in newer Access file formats), not counting large binary or text objects (BLOBs and Memo fields). You want to avoid jam-cramming everything into a few tables if doing so brings you close to either of those limits. Whether your tables follow the "many but shallow" design or the "few but deep" design, plan for the fact that you will eventually run into limits of one kind or another.
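As a rough way to see how close each table is to those limits, here is a sketch using DAO from a standard VBA module. It assumes a hypothetical procedure name, and note that `Field.Size` reports bytes for fixed-width types but the maximum character count for Text fields, so the total is only approximate:

```vba
' Sketch: list each local table's field count and an approximate
' per-record byte total, to compare against the 255-field and
' ~2,000-byte record limits.
Public Sub CheckTableLimits()
    Dim db As DAO.Database, td As DAO.TableDef, fld As DAO.Field
    Dim totalBytes As Long
    Set db = CurrentDb
    For Each td In db.TableDefs
        If Left$(td.Name, 4) <> "MSys" Then     ' skip system tables
            totalBytes = 0
            For Each fld In td.Fields
                ' Memo and OLE/BLOB fields don't count against the record limit
                If fld.Type <> dbMemo And fld.Type <> dbLongBinary Then
                    totalBytes = totalBytes + fld.Size
                End If
            Next fld
            Debug.Print td.Name, td.Fields.Count & " fields", _
                        totalBytes & " bytes (approx.)"
        End If
    Next td
End Sub
```

Run it from the Immediate window and any table flirting with either limit will stand out right away.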
Here is a thought to add to your design considerations. Often you can design a bunch of tables in the "many but shallow" format and then, where appropriate for your situation, build queries that APPEAR to be "few but deep." Don't forget that tables and queries have one important thing in common: both produce recordsets usable in other queries, forms, reports, and in VBA code that operates on recordsets through ADO or DAO.
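To make that concrete, here is a minimal sketch of the idea (the table and field names are hypothetical): two "shallow" tables are joined by a query, and the resulting "deep" recordset is consumed from VBA through DAO exactly as if it were one wide table.

```vba
' Sketch: a join query makes two shallow tables look like one deep
' recordset to downstream code.
Public Sub DemoJoinedRecordset()
    Dim db As DAO.Database, rs As DAO.Recordset
    Dim strSQL As String
    Set db = CurrentDb
    strSQL = "SELECT c.CustomerName, o.OrderDate, o.OrderTotal " & _
             "FROM tblCustomers AS c INNER JOIN tblOrders AS o " & _
             "ON c.CustomerID = o.CustomerID"
    ' A read-only snapshot is enough for display or reporting purposes
    Set rs = db.OpenRecordset(strSQL, dbOpenSnapshot)
    Do While Not rs.EOF
        Debug.Print rs!CustomerName, rs!OrderDate, rs!OrderTotal
        rs.MoveNext
    Loop
    rs.Close
End Sub
```

The same SQL could just as easily be saved as a named query and bound to a form or report, which is usually the cleaner choice in Access.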