A public forum for discussing the design of software, from the user interface to the code architecture. Now closed.
We often design DB models by the book (no redundancy, normalized, etc.). The result is fast inserts, but not-so-good performance when it comes to searching (which is what 90% of users do anyway).
I was wondering if there is a tool that will take a set of dependent normalized tables and suggest/create a flattened-out model that represents the same data in denormalized form (where search is fast).
Apparently the built-in text search of SQL Server is not an option.
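The flattening the question describes can be sketched with a toy example. This is a minimal, hypothetical illustration (using Python's sqlite3 rather than SQL Server, and made-up table names like `customer`, `product`, and `sale_flat`): normalized tables are pre-joined once into a single wide table so that searches no longer pay for joins at query time.

```python
import sqlite3

# Hypothetical normalized schema -- table and column names are illustrative,
# not taken from the thread.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE product  (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE sale     (id INTEGER PRIMARY KEY,
                           customer_id INTEGER REFERENCES customer(id),
                           product_id  INTEGER REFERENCES product(id),
                           qty INTEGER);
    INSERT INTO customer VALUES (1, 'Ada', 'London'), (2, 'Linus', 'Helsinki');
    INSERT INTO product  VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO sale     VALUES (1, 1, 1, 3), (2, 2, 2, 5);
""")

# One flattened, read-mostly table that pre-joins everything a search touches.
con.executescript("""
    CREATE TABLE sale_flat AS
    SELECT s.id AS sale_id, c.name, c.city, p.title, s.qty
    FROM sale s
    JOIN customer c ON c.id = s.customer_id
    JOIN product  p ON p.id = s.product_id;
""")

# Searches now hit a single table: no joins at query time.
rows = con.execute(
    "SELECT name, title, qty FROM sale_flat WHERE city = 'London'"
).fetchall()
print(rows)  # [('Ada', 'Widget', 3)]
```

The trade-off is exactly the one the question names: the flat table is redundant and must be rebuilt or kept in sync whenever the source tables change, so inserts get slower while searches get faster.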
Wednesday, October 29, 2008
I don't know of a tool for this, but I used star schema design to good effect when building databases whose main purpose is supporting queries. These second level databases are sometimes called "data warehouses", but that's stretching the term. More commonly, they are called things like "reporting databases" or "data marts".
There are some people who have made their data marts by building views on the normalized DB. I never tried this. I had a star schema, and used ETL techniques to get the data out of the normalized OLTP DB and into the star schema.
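The ETL-into-a-star approach described above can be sketched in miniature. This is a hedged illustration using Python's sqlite3 with invented names (`orders` as the OLTP source, `dim_customer`/`dim_date` as dimensions, `fact_sales` as the fact table), not the poster's actual schema: dimensions are loaded first, then facts are keyed to them by surrogate keys.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Hypothetical normalized OLTP source.
con.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, cust TEXT, day TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'Ada', '2008-10-01', 9.5),
                              (2, 'Ada', '2008-10-02', 3.0),
                              (3, 'Bob', '2008-10-01', 7.0);
""")
# Star schema: one central fact table surrounded by small dimension tables.
con.executescript("""
    CREATE TABLE dim_customer (cust_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_sales   (cust_key INTEGER, date_key INTEGER, amount REAL);
""")
# A toy ETL step: populate dimensions, then load facts via surrogate keys.
con.executescript("""
    INSERT INTO dim_customer (name) SELECT DISTINCT cust FROM orders;
    INSERT INTO dim_date (day)      SELECT DISTINCT day  FROM orders;
    INSERT INTO fact_sales
    SELECT c.cust_key, d.date_key, o.amount
    FROM orders o
    JOIN dim_customer c ON c.name = o.cust
    JOIN dim_date d     ON d.day  = o.day;
""")
# Reporting queries become simple star joins plus GROUP BY.
totals = con.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c ON c.cust_key = f.cust_key
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(totals)  # [('Ada', 12.5), ('Bob', 7.0)]
```

The shape is the point: every report is "fact table joined to whichever dimensions you want to slice by," which is why star-schema queries are easy to compose.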
If you were brought up in the true religion of normalization, star schema looks moronic when you first see it. It takes a little bit of deprogramming to understand that there is a design discipline here, and that good star schema design is far better than bad star schema design. When you get a good star (or a constellation of stars), queries not only run faster, they are also far easier to compose.
I once got a Cognos user to build a data cube on top of one of my star schemas, and it turns out that Cognos can figure out what the star schema structure means and can convert it to data cube structure all by itself. The result is that creating useful summaries and charts is like playing a video game.
There are several other products that are just as good as Cognos. I mentioned it by name because that was the one I saw used.
Given that there are many different ways a database can be denormalised, a tool is going to have trouble picking which option best suits the specific queries being executed.
The tool you're after is a profiler, which will tell you which queries are the bottlenecks; then you can denormalise as necessary.
Thursday, October 30, 2008
The speed of your queries has much more to do with indexing than denormalizing.
Seriously, are your users so advanced that they are joining more than 2-3 tables?
I don't see denormalizing as a solution unless you are really looking at massive (and I mean massive) amounts of data joining bunches of tables.
I really don't know your situation, but my gut tells me you have a user base that is writing queries in MSQuery and returning large amounts of data to Excel, where badly structured queries and network-traffic abuse are the primary culprits.
Too lazy to log on.
Friday, October 31, 2008
If you are running SQL Server, try the SQL Profiler tool to find out why the searches are slow. Another poster has pointed out that you should look at your indexes: check them, and check the queries your users are writing. Perhaps they're using 'like' searches, which require a table scan. Or you may need to optimize/tune your hardware/software. There are plenty of easy things to do before changing table structures.
And do you have auto-generate statistics turned on?
In 99% of the cases where I have seen database performance problems with queries, it was because the indexing wasn't appropriate for the queries being run.
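The index-first and 'like'-scan points above can be demonstrated concretely. This sketch uses Python's sqlite3 and a made-up `users` table (not SQL Server or the poster's schema, though the behavior is analogous): the query planner reports a full scan before an index exists, an index search after, and a scan again for a leading-wildcard LIKE, which an ordinary index can't help with.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

def plan(sql):
    # Return SQLite's human-readable query-plan detail for a statement.
    return con.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

before = plan("SELECT * FROM users WHERE email = 'a@example.com'")

con.execute("CREATE INDEX idx_users_email ON users(email)")
after = plan("SELECT * FROM users WHERE email = 'a@example.com'")

# A LIKE with a leading wildcard has no usable prefix, so it still scans.
leading_wildcard = plan("SELECT * FROM users WHERE email LIKE '%@example.com'")

print(before)            # a full scan of users
print(after)             # a search using idx_users_email
print(leading_wildcard)  # still a scan, despite the index
```

The same diagnosis works on any engine that exposes query plans (SHOWPLAN in SQL Server, EXPLAIN in most others): look at what the planner actually does before restructuring tables.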
Make sure you have the basics right before considering esoterica.
Sunday, November 09, 2008
This topic is archived. No further replies will be accepted.
Powered by FogBugz