How to move tables from one SQL Server database to another?
We have a database that has grown to about 50GB and we want to pull out a certain set of tables (about 20 of them) and move them into a new database. All of this would be on the same SQL Server. The tables we want to pull out take up about 12GB of space (6GB data, 6GB indexes).
How can we move the tables from one database to the other but make sure the tables that are created in the new database are an exact copy of the originals (indexes, keys, etc.)? Ideally I want a copy/paste from within SQL Server Management Studio but I know this does not exist, so what are my options?
To do this really easily with SQL Server 2008 Management Studio:
1.) Right click on the database (not the table) and select Tasks -> Generate Scripts
2.) Click Next on the first page
3.) If you want to copy the whole database, just click next. If you want to copy specific tables, click on "Select Specific Database Objects", select the tables you want, and then click next.
4.) Select "Save to Clipboard" or "Save to File". IMPORTANT: Click the Advanced button next to "Save to File", find "Types of data to script", and change "Schema only" to "Schema and data" (if you want to create the table) or "Data only" (if you're copying data to an existing table). This is also where you'd set other options such as exactly what keys to copy, etc.
5.) Click through the rest and you're done!
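For reference, the script the wizard generates with "Schema and data" looks roughly like the sketch below (NewDatabase and dbo.Customer are placeholder names; the real output will contain your actual columns, keys, and indexes):
USE NewDatabase
GO
CREATE TABLE [dbo].[Customer](
    [CustomerId] [int] IDENTITY(1,1) NOT NULL,
    [Name] [nvarchar](100) NOT NULL,
    CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED ([CustomerId])
)
GO
-- data is scripted as INSERT statements, with identity values preserved
SET IDENTITY_INSERT [dbo].[Customer] ON
INSERT [dbo].[Customer] ([CustomerId], [Name]) VALUES (1, N'Acme')
SET IDENTITY_INSERT [dbo].[Customer] OFF
GO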
If you're moving the tables to a whole new database just because of growth, you might be better off considering using filegroups in your existing database instead. There will be a lot fewer headaches going forward than trying to deal with two separate databases.
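If you go the filegroup route, a minimal sketch looks like this (the database, filegroup, file path, table, and index names are placeholders; rebuilding the table's existing clustered index with DROP_EXISTING moves its data onto the new filegroup):
ALTER DATABASE MyDatabase ADD FILEGROUP SecondaryFG
GO
ALTER DATABASE MyDatabase
ADD FILE (NAME = 'MyDatabase_Secondary', FILENAME = 'D:\Data\MyDatabase_Secondary.ndf')
TO FILEGROUP SecondaryFG
GO
-- move a table by rebuilding its clustered index on the new filegroup
CREATE UNIQUE CLUSTERED INDEX PK_BigTable
ON dbo.BigTable (BigTableId)
WITH (DROP_EXISTING = ON)
ON SecondaryFG
GO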
EDIT
As I mentioned in my comments below, if you truly need a new database, depending on the total number of tables involved, it might be easier to restore a backup of the database under the new name and drop the tables you don't want.
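A minimal sketch of that approach (database names, logical file names, and paths below are placeholders; use RESTORE FILELISTONLY to confirm the real logical names):
BACKUP DATABASE SourceDb TO DISK = 'D:\Backups\SourceDb.bak' WITH COPY_ONLY, INIT
GO
RESTORE FILELISTONLY FROM DISK = 'D:\Backups\SourceDb.bak'  -- confirm logical file names
GO
RESTORE DATABASE NewDb FROM DISK = 'D:\Backups\SourceDb.bak'
WITH MOVE 'SourceDb' TO 'D:\Data\NewDb.mdf',
     MOVE 'SourceDb_log' TO 'D:\Data\NewDb_log.ldf'
GO
USE NewDb
GO
DROP TABLE dbo.SomeUnwantedTable  -- repeat for each table you don't want to keep
GO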
I also found this potential solution using SQL Server Management Studio. You can generate scripts for the specific tables with the Generate Scripts Wizard and then move the data with the Import/Export Wizard. On the new database you would run the scripts to create all of the objects and then import the data. We are probably going to go with the backup/restore method described in @Joe Stefanelli's answer, but I found this method and wanted to post it for others to see.
To generate the sql script for the objects:
- SQL Server Management Studio > Databases > Database1 > Tasks > Generate Scripts...
- The SQL Server Scripts Wizard will start and you can choose the objects and settings to export into scripts
- By default, scripting of indexes and triggers is not included, so make sure to turn these on (and any others that you are interested in).
To export the data from the tables:
- SQL Server Management Studio > Databases > Database1 > Tasks > Export Data...
- Choose the source and destination databases
- Select the tables to export
- Make sure to check the Enable Identity Insert option for each table so that existing identity values are preserved instead of new ones being generated.
Then create the new database, run the scripts to create all of the objects, and then import the data.
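If you ever need to do the data move by hand rather than through the Import/Export Wizard, the Enable Identity Insert option corresponds to wrapping each load in SET IDENTITY_INSERT, roughly like this (database, table, and column names are placeholders):
SET IDENTITY_INSERT NewDb.dbo.Customer ON
INSERT INTO NewDb.dbo.Customer (CustomerId, Name)
SELECT CustomerId, Name
FROM OldDb.dbo.Customer
SET IDENTITY_INSERT NewDb.dbo.Customer OFF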
If you like/have SSIS, you can explore using the Transfer SQL Server Objects task to do this.
Try DBSourceTools.
http://dbsourcetools.codeplex.com.
This toolset uses SMO to script tables and data to disk, and also allows you to select which tables, views, and stored procedures to include.
When using a "deployment target", it will also automatically handle dependencies.
I have used it repeatedly for exactly this type of problem, and it's extremely simple and fast.
On the same server you can do this in T-SQL with SELECT ... INTO and a three-part name for the target (the bracketed "IN new database" syntax is Access, not SQL Server):
SELECT *
INTO new_database_name.dbo.new_table_name
FROM old_database_name.dbo.old_tablename
Note that SELECT ... INTO creates the new table and copies the data, but it does not copy indexes, keys, or other constraints.
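If you go this route, you have to recreate the keys and indexes on the new table yourself afterwards, for example (all names here are placeholders):
USE new_database_name
GO
ALTER TABLE dbo.new_table_name ADD CONSTRAINT PK_new_table_name PRIMARY KEY CLUSTERED (id)
GO
CREATE NONCLUSTERED INDEX IX_new_table_name_some_column ON dbo.new_table_name (some_column)
GO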
A lazy, efficient way to do this in T-SQL:
In my case, some of the tables are large, so scripting out the data is impractical.
Also, we needed to migrate just a fraction of an otherwise very large database, so I didn't want to do backup / restore.
So I went with INSERT INTO / SELECT FROM and used information_schema etc to generate the code.
Step 1: create your tables on new DB
For every table you want to migrate, create that table on the new database.
Either script out the tables, use SQL Compare, or build dynamic SQL from INFORMATION_SCHEMA -- there are many ways to do it. dallin's answer shows one way using SSMS (but be sure to select schema only).
Step 2: create UDF on target DB to produce column list
This is just a helper function used in generation of code.
USE [staging_edw]
GO
CREATE FUNCTION dbo.udf_get_column_list
(
@table_name varchar(8000)
)
RETURNS VARCHAR(8000)
AS
BEGIN
DECLARE @var VARCHAR(8000)
SELECT
@var = COALESCE(@var + ',', '') + c.COLUMN_NAME
FROM INFORMATION_SCHEMA.columns c
WHERE c.TABLE_SCHEMA + '.' + c.TABLE_NAME = @table_name
AND c.COLUMN_NAME NOT LIKE '%hash%'
RETURN @var
END
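A quick smoke test of the function (the table name is a placeholder; pass a schema-qualified name because the function matches on TABLE_SCHEMA + '.' + TABLE_NAME):
SELECT staging_edw.dbo.udf_get_column_list('dbo.some_table')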
Step 3: create log table
The generated code will log progress into this table so you can monitor. But you have to create this log table first.
USE staging_edw
GO
IF OBJECT_ID('dbo.tmp_sedw_migration_log') IS NULL
CREATE TABLE dbo.tmp_sedw_migration_log
(
step_number INT IDENTITY,
step VARCHAR(100),
start_time DATETIME
)
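While the generated script from step 4 is running, you can monitor progress with a query like:
SELECT step_number, step, start_time
FROM dbo.tmp_sedw_migration_log
ORDER BY step_number DESC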
Step 4: generate migration script
Here you generate the T-SQL that will migrate the data for you. It just generates INSERT INTO / SELECT FROM statements for every table, and logs its progress along the way.
This script does not actually modify anything. It just outputs some code, which you can inspect before executing.
USE staging_edw
GO
-- newline characters for formatting of generated code
DECLARE @n VARCHAR(100) = CHAR(13)+CHAR(10)
DECLARE @t VARCHAR(100) = CHAR(9)
DECLARE @2n VARCHAR(100) = @n + @n
DECLARE @2nt VARCHAR(100) = @n + @n + @t
DECLARE @nt VARCHAR(100) = @n + @t
DECLARE @n2t VARCHAR(100) = @n + @t + @t
DECLARE @2n2t VARCHAR(100) = @n + @n + @t + @t
DECLARE @3n VARCHAR(100) = @n + @n + @n
-- identify tables with identity columns
IF OBJECT_ID('tempdb..#identities') IS NOT NULL
DROP TABLE #identities;
SELECT
table_schema = s.name,
table_name = o.name
INTO #identities
FROM sys.objects o
JOIN sys.columns c on o.object_id = c.object_id
JOIN sys.schemas s ON s.schema_id = o.schema_id
WHERE 1=1
AND c.is_identity = 1
-- generate the code
SELECT
@3n + '-- ' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME,
@n + 'BEGIN TRY',
@2nt + IIF(i.table_schema IS NOT NULL, 'SET IDENTITY_INSERT staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' ON ', ''),
@2nt + 'TRUNCATE TABLE staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME,
@2nt + 'INSERT INTO staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' WITH (TABLOCKX) ( ' + f.f + ' ) ',
@2nt + 'SELECT ' + f.f + @nt + 'FROM staging.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME,
@2nt + IIF(i.table_schema IS NOT NULL, 'SET IDENTITY_INSERT staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' OFF ', ''),
@2nt + 'INSERT INTO dbo.tmp_sedw_migration_log ( step, start_time ) VALUES ( ''' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' inserted successfully'', GETDATE() );' ,
@2n + 'END TRY',
@2n + 'BEGIN CATCH',
@2nt + 'INSERT INTO dbo.tmp_sedw_migration_log ( step, start_time ) VALUES ( ''' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' FAILED'', GETDATE() );' ,
@2n + 'END CATCH'
FROM INFORMATION_SCHEMA.tables t
OUTER APPLY (SELECT f = staging_edw.dbo.udf_get_column_list(t.TABLE_SCHEMA + '.' + t.TABLE_NAME)) f
LEFT JOIN #identities i ON i.table_name = t.TABLE_NAME
AND i.table_schema = t.TABLE_SCHEMA
WHERE t.TABLE_TYPE = 'BASE TABLE'
Step 5: run the code
Now you just copy the output from step 4, paste into new query window, and run.
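For a hypothetical table staging.dbo.customer with an identity column, one block of the generated output looks roughly like this:
-- dbo.customer
BEGIN TRY
    SET IDENTITY_INSERT staging_edw.dbo.customer ON
    TRUNCATE TABLE staging_edw.dbo.customer
    INSERT INTO staging_edw.dbo.customer WITH (TABLOCKX) ( customer_id,customer_name )
    SELECT customer_id,customer_name
    FROM staging.dbo.customer
    SET IDENTITY_INSERT staging_edw.dbo.customer OFF
    INSERT INTO dbo.tmp_sedw_migration_log ( step, start_time ) VALUES ( 'dbo.customer inserted successfully', GETDATE() );
END TRY
BEGIN CATCH
    INSERT INTO dbo.tmp_sedw_migration_log ( step, start_time ) VALUES ( 'dbo.customer FAILED', GETDATE() );
END CATCH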
Notes
- In step 2, I exclude hash columns from the column list (in the UDF) because those are computed columns in my situation