06 September, 2016 by Tom Collins
There isn’t a limit to the number of NULLs allowed in a data set, but having more NULLs than you think you need may be a sign that it’s time to review the design.
Some applications require a high number of NULLs; this is considered acceptable and is a built-in expectation. For example, I was working on a large ETL process where the staging server allowed NULLs because it fused data from different sources. This worked fine, but there were clean-up processes that dealt with the NULLs, and post-import scripts that handled NULLs in preparation for generating reports.
One of the major issues with allowing NULLs is that the programming side can get inefficient. A standard example is having to construct WHERE clauses with IS NULL or IS NOT NULL predicates. These can play havoc with the Optimizer, leading to inefficient Execution Plans.
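As a minimal sketch, assume a hypothetical Orders table with a nullable ShippedDate column. A WHERE clause that must also match the NULL rows typically needs an OR ... IS NULL branch, which the Optimizer struggles to satisfy with a straightforward index seek:

-- Hypothetical table: dbo.Orders with a nullable ShippedDate column.
DECLARE @ShippedDate date = '2016-09-01';

-- The OR ... IS NULL branch often prevents a simple index seek,
-- so the Optimizer may fall back to scanning the index or table.
SELECT OrderID, CustomerID, ShippedDate
FROM dbo.Orders
WHERE ShippedDate = @ShippedDate
   OR ShippedDate IS NULL;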
Another issue is built-in functions: how will built-in functions deal with NULLs?
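As an illustration (the Employees table and Bonus column here are hypothetical), aggregates silently skip NULLs, and string concatenation with + propagates them:

-- Aggregates skip NULLs rather than treating them as zero.
SELECT
    COUNT(*)              AS AllRows,          -- counts every row
    COUNT(Bonus)          AS RowsWithBonus,    -- skips rows where Bonus IS NULL
    SUM(Bonus)            AS TotalBonus,       -- NULLs ignored, not counted as zero
    SUM(ISNULL(Bonus, 0)) AS TotalBonusAsZero  -- makes the intent explicit
FROM dbo.Employees;

-- String concatenation with + propagates NULL by default;
-- CONCAT (SQL Server 2012 onwards) treats NULL as an empty string.
SELECT 'Name: ' + NULL        AS PlusOperator,    -- returns NULL
       CONCAT('Name: ', NULL) AS ConcatFunction;  -- returns 'Name: '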
NULL values don’t index effectively, so a search against a column with many NULLs can lead to slow response times. There are workarounds, such as filtered statistics and filtered indexes. The drawback of relying on filters is that they only help searches that find non-NULL values; filters won’t help searches that are looking for NULL values.
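A minimal sketch of that workaround, again with hypothetical names: a filtered index over the non-NULL rows keeps the index small and helps equality searches, but a query filtering on ShippedDate IS NULL cannot use it, because those rows are excluded from the index.

-- Filtered index covering only the non-NULL rows (hypothetical names).
CREATE NONCLUSTERED INDEX IX_Orders_ShippedDate_NotNull
    ON dbo.Orders (ShippedDate)
    WHERE ShippedDate IS NOT NULL;

-- This search can use the filtered index:
SELECT OrderID FROM dbo.Orders WHERE ShippedDate = '2016-09-01';

-- This search cannot - the NULL rows are not in the index:
SELECT OrderID FROM dbo.Orders WHERE ShippedDate IS NULL;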