Is your SQL query taking way too much time to give a result? Are you wondering why your query is taking so much more time than before? Are you blaming the database or the DBA team for your queries' performance?
Then we are in the same boat, and let's get it straight: it is not the fault of your growing data (which will happen over time), the database, or your DBA team. It is pretty much your fault that the query is taking too much time, because you didn't plan your query properly.
SQL may be simple to write, but unlike other programming languages, it requires planning and an understanding of your application before you write your code.
Today, let's discuss improving the performance of your queries.
I am assuming that you have a fully working query which is taking more time than desired to execute and return a result. I would also assume that you understand how a query is parsed and executed by SQL (not sure? click here).
1. Pre-Aggregating / Pre-Processing the Data:
When we have a query that is taking too much time, chances are we either have huge data in our tables or complex joins. In both cases, the best thing to do is, instead of throwing the entire tables and complex joins at the database at once, divide them into smaller sets and store them in a temp table or CTE.
Let's take the below code as an example, and let's assume that each of the tables contains a minimum of 100 million records. If we try the below code, chances are it will take a very long time to execute.
SELECT *
FROM Employees E
INNER JOIN Departments D ON E.DepartmentId = D.id
INNER JOIN Cars C ON E.id = C.EmployeeId
INNER JOIN Customers CU ON C.CustomerId = CU.id
WHERE E.id < 100000
  AND CU.id < 100000
  AND C.TotalCost < 1000
Now let's break this code up. We see that we only need the records with id < 100000 from the Employees table. Instead of joining first and then restricting the data with the WHERE clause, why not pre-process each table, store it in a temp table or CTE, and then use that in the join (like below)? That action alone reduces the number of rows each join has to process.
WITH e_limit AS (
    SELECT * FROM Employees E WHERE E.id < 100000
),
c_limit AS (
    SELECT * FROM Cars C WHERE C.TotalCost < 1000
),
cu_limit AS (
    SELECT * FROM Customers CU WHERE CU.id < 100000
)
SELECT *
FROM e_limit E
INNER JOIN c_limit C ON E.id = C.EmployeeId
INNER JOIN cu_limit CU ON C.CustomerId = CU.id
INNER JOIN Departments D ON E.DepartmentId = D.id
2. Get Rid of Subqueries:
Yes, don't use subqueries in queries which are already complex. Subqueries tend to change the default order in which the query is executed, and that will affect your performance. If you really need one, consider pre-processing the table it refers to instead.
SELECT E.*
FROM Employees E
INNER JOIN Departments D ON E.DepartmentId = D.id
INNER JOIN Cars C ON E.id = C.EmployeeId
INNER JOIN Customers CU ON C.CustomerId = CU.id
WHERE E.id < 100000
  AND CU.id < 100000
  AND C.TotalCost < 1000
  AND E.id IN (SELECT id FROM Employees WHERE FNAME LIKE 'J%')
Since the subquery here selects from the same Employees table, it can be folded into the pre-processing CTE as a plain filter:

WITH e_limit AS (
    SELECT * FROM Employees E
    WHERE E.id < 100000
      AND E.FNAME LIKE 'J%'
)
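Putting it together, the subquery-free version might look like this (a sketch reusing the hypothetical CTEs from section 1):

WITH e_limit AS (
    SELECT * FROM Employees E
    WHERE E.id < 100000
      AND E.FNAME LIKE 'J%'   -- the IN subquery folded into a direct filter
),
c_limit AS (
    SELECT * FROM Cars C WHERE C.TotalCost < 1000
),
cu_limit AS (
    SELECT * FROM Customers CU WHERE CU.id < 100000
)
SELECT E.*
FROM e_limit E
INNER JOIN c_limit C ON E.id = C.EmployeeId
INNER JOIN cu_limit CU ON C.CustomerId = CU.id
INNER JOIN Departments D ON E.DepartmentId = D.id

The optimizer now sees one flat plan with simple filters instead of a nested subquery it has to evaluate separately.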
3. Wildcards and Their Effect on Processing:
When searching plain-text data, such as names, wildcards create the widest search possible. However, the widest search is also the most inefficient search. Having wildcards both before and after a phrase forces the database to scan the entire column to match the phrase, because a leading wildcard prevents the use of an index.
SELECT ID FROM Employees WHERE FNAME LIKE '%CHA%'
The below way of using the wildcard is better, because the database can use an index on FNAME to seek directly to the matching rows:
SELECT ID FROM Employees WHERE FNAME LIKE 'CHA%'
4. Avoid While Loops:
If you have loops in your queries, they are bound to take time: the database executes one statement per iteration instead of a single set-based operation. Try to find set-based alternatives that avoid them.
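As a sketch of what this means in practice (T-SQL syntax; the Salary column and the department filter are hypothetical, not from the tables above), here is a row-by-row WHILE loop and its set-based equivalent:

-- Row-by-row loop: one UPDATE statement per employee, slow on large tables
DECLARE @id INT = (SELECT MIN(id) FROM Employees WHERE DepartmentId = 5);
WHILE @id IS NOT NULL
BEGIN
    UPDATE Employees SET Salary = Salary * 1.10 WHERE id = @id;
    SET @id = (SELECT MIN(id) FROM Employees
               WHERE DepartmentId = 5 AND id > @id);
END

-- Set-based equivalent: one statement, and the optimizer does the work
UPDATE Employees
SET Salary = Salary * 1.10
WHERE DepartmentId = 5;

Both versions give the same result, but the set-based one issues a single statement instead of one per row.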