Temporary Tables, Table Variables, Bulk Insert, and Import/Export


Master advanced data handling in SQL with this beginner-friendly guide. Learn how to use temporary tables, table variables, bulk insert operations, and import/export data in CSV or JSON formats. Practice creating temp tables and bulk inserting multiple records efficiently.

1. Introduction

Advanced data handling techniques help manage large datasets efficiently and allow complex processing without affecting main tables.

  1. Use temporary tables and table variables for intermediate storage.
  2. Bulk insert and import/export operations make data migration and reporting easier.

Key Points:

  1. Temporary structures exist only for the duration of a session, batch, or procedure.
  2. Bulk operations are faster than individual inserts for large data.
  3. Import/export supports integration with external systems.

2. Temporary Tables

Temporary tables store data temporarily for session-specific processing.

Syntax:


CREATE TABLE #TempEmployees (
    ID INT,
    Name VARCHAR(50),
    Salary DECIMAL(10,2)
);

Insert data into temp table:


INSERT INTO #TempEmployees (ID, Name, Salary)
VALUES
    (1, 'John', 50000),
    (2, 'Jane', 60000);

Query temp table:


SELECT * FROM #TempEmployees;

Drop temp table:


DROP TABLE #TempEmployees;
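
A temp table can also be created and populated in one step with SELECT ... INTO, which infers the column types from the source query. A minimal sketch, assuming an Employees table with ID, Name, and Salary columns exists:


SELECT ID, Name, Salary
INTO #HighEarners          -- table is created automatically
FROM Employees
WHERE Salary > 55000;

SELECT * FROM #HighEarners;

DROP TABLE #HighEarners;

This pattern is handy for snapshotting a filtered subset before running multi-step updates against it.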

3. Table Variables

Table variables are similar to temporary tables but exist only in the scope of a batch or procedure.

Example:


DECLARE @Employees TABLE (
    ID INT,
    Name VARCHAR(50),
    Salary DECIMAL(10,2)
);

INSERT INTO @Employees (ID, Name, Salary)
VALUES
    (1, 'Alice', 70000),
    (2, 'Bob', 55000);

SELECT * FROM @Employees;

Key Difference: Table variables go out of scope automatically (no explicit DROP is needed) and have less overhead for small datasets; temporary tables usually perform better for large datasets because the optimizer maintains statistics on them.
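
Because a table variable can be queried like any table, aggregates such as a total salary work directly on it. A short self-contained sketch (table variables only live within a single batch, so the declaration and the query must run together):


DECLARE @Employees TABLE (
    ID INT,
    Name VARCHAR(50),
    Salary DECIMAL(10,2)
);

INSERT INTO @Employees (ID, Name, Salary)
VALUES
    (1, 'Alice', 70000),
    (2, 'Bob', 55000);

-- Aggregate over the table variable exactly as over a regular table
SELECT COUNT(*)   AS EmployeeCount,
       SUM(Salary) AS TotalSalary,
       AVG(Salary) AS AvgSalary
FROM @Employees;
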

4. Bulk Insert Operations

Bulk insert is used to insert large volumes of data efficiently.

Example (SQL Server):


BULK INSERT Employees
FROM 'C:\Data\Employees.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);

Key Points:

  1. CSV files are the most common source; modern database engines can also load JSON and XML.
  2. Faster than multiple single-row inserts.
  3. Useful for ETL (Extract, Transform, Load) operations.
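
A common ETL pattern is to bulk insert into a staging temp table first and validate the rows before touching the main table. A sketch under the assumption that the CSV layout matches the staging table below (the file path and the #StagingEmployees name are illustrative):


CREATE TABLE #StagingEmployees (
    ID INT,
    Name VARCHAR(50),
    Salary DECIMAL(10,2)
);

BULK INSERT #StagingEmployees
FROM 'C:\Data\Employees.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2,
    TABLOCK               -- table lock allows a faster, minimally logged load
);

-- Only move rows that pass basic validation into the main table
INSERT INTO Employees (ID, Name, Salary)
SELECT ID, Name, Salary
FROM #StagingEmployees
WHERE Name IS NOT NULL AND Salary >= 0;

DROP TABLE #StagingEmployees;

This keeps bad rows out of Employees and makes the load easy to inspect or rerun.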

5. Data Import / Export

5.1 Export to CSV (MySQL Example)


SELECT ID, Name, Salary
INTO OUTFILE 'C:/Data/Employees.csv'
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
FROM Employees;

5.2 Import from JSON (SQL Server Example)


DECLARE @json NVARCHAR(MAX) = N'[
    {"ID": 1, "Name": "Alice", "Salary": 70000},
    {"ID": 2, "Name": "Bob",   "Salary": 55000}
]';

SELECT *
FROM OPENJSON(@json)
WITH (
    ID INT,
    Name NVARCHAR(50),
    Salary DECIMAL(10,2)
);
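
SQL Server can also go the other direction and emit JSON with the FOR JSON clause. A minimal export sketch, assuming the same Employees table:


SELECT ID, Name, Salary
FROM Employees
FOR JSON PATH;

The query returns a single JSON string such as [{"ID":1,"Name":"Alice","Salary":70000}, ...]; writing it to a file is typically done from the calling application or a utility such as bcp.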

6. Practical Exercises

  1. Create a temporary table for processing employee salary updates.
  2. Create a table variable and perform calculations like total salary.
  3. Perform a bulk insert of 1000+ employee records from CSV.
  4. Export employee data to CSV or JSON.
  5. Compare performance of table variables vs temp tables for large datasets.

7. Tips for Beginners

  1. Use temporary tables for intermediate computations in queries or stored procedures.
  2. Use table variables for small datasets inside procedures or functions.
  3. Bulk insert is essential for loading large datasets efficiently.
  4. Always validate imported data to avoid corrupting tables.
  5. Combine temp tables, bulk insert, and stored procedures for ETL pipelines.


Next Step: After mastering advanced data handling, the next module is Security and Permissions in SQL, where you’ll learn to manage users, roles, grants, and prevent SQL injection.