Reduce execution time when joining two or more tables with massive data

I am trying to join several tables, one of which has more than 7,000,000 rows.

I have to join multiple tables and get the insights in a single query. The query is very slow and takes more than 30 minutes to return a result. How can I reduce the execution time?

Here is my code:

SELECT
         j.Id
      , j.JobTypeName
      , j.Title
      , j.CompanyId
      , c.Name
      , j.LocationId
      , l.Country
      , l.State
      , l.City
      , l.ZipCode
      , j.AtsId
      , at.Name
      , j.IndividualAtsCompanyId
      , ic.FeedURI
      , ic.FeedStatus
      , atf.FeedStatusName
      , j.Created
      , j.LastModified
  FROM dbo.Jobs AS j WITH (NOLOCK)
  LEFT JOIN dbo.Ats AS at ON j.AtsId = at.Id
  LEFT JOIN dbo.IndividualAtsCompanies AS ic ON j.IndividualAtsCompanyId = ic.Id
  LEFT JOIN dbo.AtsFeeds AS atf ON ic.FeedStatus = atf.FeedStatus
  LEFT JOIN dbo.Companies AS c ON j.CompanyId = c.Id
  LEFT JOIN dbo.Locations AS l ON j.LocationId = l.Id


Solution 1:[1]

The following factors affect the execution time:

  1. SQL is a declarative language: the developer describes the result set, not how to compute it. The query optimizer decides how the tables are joined, so it can only choose a fast plan if the schema gives it something to work with, such as indexes on the join columns.

  2. Create indexes on the columns used in the joins. With appropriate indexes on the large tables, the optimizer can seek directly to the matching rows instead of scanning millions of rows, which is what reduces the execution time (see the index sketch after this list).

  3. Read "SQL server execution plans". As there are some pre-defined query rules and plans, we need to know and follow before executing any query.

Start by implementing the query with indexes in place. Even so, indexing is only one of the options: returning every row of such a huge dataset in a single query is still not good practice.
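
One way to follow point 3 in SQL Server Management Studio is to capture I/O and timing statistics (or the actual execution plan) while running the query. This is a generic sketch; the simplified SELECT below stands in for the full query from the question:

    -- Report logical reads and CPU/elapsed time for each statement.
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    -- Run the query from the question here (or enable
    -- "Include Actual Execution Plan" in SSMS before executing).
    SELECT j.Id, c.Name
      FROM dbo.Jobs AS j
      LEFT JOIN dbo.Companies AS c ON j.CompanyId = c.Id;

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;

The statistics output shows which table causes the most logical reads, and the actual plan shows whether the optimizer chose scans or seeks for each join.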

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 SairamTadepalli-MT