
Import a MySQL database into SQL Server

I have a .sql file from a MySQL dump containing table definitions and the data to be inserted into these tables. How can I convert the database represented in the dump file to a SQL Server database?


Use SQL Server Migration Assistant (SSMA)

In addition to MySQL it supports Oracle, Sybase and MS Access.

It appears to be quite smart and capable of handling even nontrivial transfers. It also has a command-line interface (in addition to the GUI), so in theory it can be integrated into a batch load process (see the sketch below).

Here is the current download link for the MySQL version: https://www.microsoft.com/en-us/download/details.aspx?id=54257

The current (June 2016) stable version 6.0.1 crashes with the current (5.3.6) MySQL ODBC driver while transferring data (everything 64-bit). Version 5.3 with the 5.1.13 ODBC driver works fine.
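For the batch-load angle: SSMA ships a console executable driven by XML script files. A rough sketch of the invocation, with placeholder file names (check the SSMA console documentation for the actual script and connection file schemas):

SSMAforMySQLConsole.exe -s migration_script.xml -v variable_values.xml -c server_connections.xml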


I suggest you use mysqldump like so:

mysqldump --compatible=mssql -u [username] -p db_name > dump_mssql.sql

phpMyAdmin is still a web app and could hit limits on large databases (script execution time, allocatable memory, and so on).


I found a way to do this on the net.

It demands a bit of work, because it has to be done table by table. But anyway, I was able to copy the tables, data, and constraints into a SQL Server database.

Here is the link

http://www.codeproject.com/KB/database/migrate-mysql-to-mssql.aspx


If you do an export with phpMyAdmin, you can switch the SQL compatibility mode to 'MSSQL'. That way you just run the exported script against your MS SQL database and you're done.

If you cannot or don't want to use phpMyAdmin, there's also a compatibility option in mysqldump, but personally I'd rather have phpMyAdmin do it for me.


Here is my approach for importing .sql files to SQL Server:

  1. Export the table from MySQL with the --compatible=mssql and --extended-insert=FALSE options:

    mysqldump -u [username] -p --compatible=mssql --extended-insert=FALSE db_name table_name > table_backup.sql

  2. Split the exported file with PowerShell into chunks of 300,000 lines per file:

    $i=0; Get-Content exported.sql -ReadCount 300000 | %{$i++; $_ | Out-File out_$i.sql}

  3. Run each file in SQL Server Management Studio (or script this step with sqlcmd; see the sketch below).
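If you'd rather not open each chunk in SSMS, a minimal loop with sqlcmd works too. This is a sketch from a bash shell, assuming the mssql-tools sqlcmd client is installed; server name and credentials are placeholders:

for f in out_*.sql; do
    sqlcmd -S your_server -d your_db -U your_user -P your_pass -i "$f"
done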

There are a few tricks to speed up the inserts.

Another approach is to use the mysqldump --where option. With it you can split your table on any condition supported by a SQL WHERE clause.
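For example, two dumps that split the table on its primary key (placeholder names; pick any condition that partitions your rows):

mysqldump -u [username] -p --compatible=mssql --extended-insert=FALSE --where="id <= 500000" db_name table_name > table_part1.sql
mysqldump -u [username] -p --compatible=mssql --extended-insert=FALSE --where="id > 500000" db_name table_name > table_part2.sql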


I had a very similar issue today: I needed to copy a big table (5 million rows) from MySQL into MS SQL.

Here are the steps I took (under Ubuntu Linux):

  1. Created a table in MS SQL whose structure matches the source table in MySQL.

  2. Installed the MS SQL command-line tools: https://learn.microsoft.com/en-us/sql/linux/sql-server-linux-setup-tools#ubuntu

  3. Dumped the table from MySQL to a file:

mysqldump \
    --compact \
    --complete-insert \
    --no-create-info \
    --compatible=mssql \
    --extended-insert=FALSE \
    --host "$MYSQL_HOST" \
    --user "$MYSQL_USER" \
    -p"$MYSQL_PASS" \
    "$MYSQL_DB" \
    "$TABLE" > "$FILENAME"
  4. In my case the dump file was quite large, so I decided to split it into a number of small pieces (1000 lines each): split --lines=1000 "$FILENAME" part-

  5. Finally, I iterated over these small files, did some text replacements, and executed the pieces one by one against the MS SQL server:

export SQLCMD=/opt/mssql-tools/bin/sqlcmd

x=0

for file in part-*
do
  echo "Exporting file [$file] into MS SQL. $x thousand(s) processed"

  # replaces \' with ''
  sed -i "s/\\\'/''/g" "$file"

  # removes all "
  sed -i 's/"//g' "$file"

  # allow inserting records with an explicit PK (id)
  sed -i "1s/^/SET IDENTITY_INSERT $TABLE ON;\n/" "$file"

  "$SQLCMD" -S "$AZURE_SERVER" -d "$AZURE_DB" -U "$AZURE_USER" -P "$AZURE_PASS" -i "$file"
  echo ""
  echo ""

  x=$((x+1))
done

echo "Done"

Of course, you'll need to replace my variables like $AZURE_SERVER, $TABLE, etc. with yours.

Hope that helps.


For me it worked best to export all data with this command:

mysqldump -u USERNAME -p --all-databases --complete-insert --extended-insert=FALSE --compatible=mssql > backup.sql

--extended-insert=FALSE is needed to avoid SQL Server's 1000-row limit per INSERT statement.

I created my tables with my migration tool, so I'm not sure whether the CREATE statements from the backup.sql file will work.
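If you want to inspect those CREATE statements on their own before trusting them, a schema-only dump keeps the file small. A sketch using the standard --no-data flag; even in mssql mode, expect to hand-edit MySQL-isms such as AUTO_INCREMENT and engine clauses:

mysqldump -u USERNAME -p --all-databases --no-data --compatible=mssql > schema_only.sql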

In SQL Server's SSMS I had to import the data table by table with IDENTITY_INSERT ON to write the ID fields:

SET IDENTITY_INSERT dbo.app_warehouse ON;
GO 
INSERT INTO "app_warehouse" ("id", "Name", "Standort", "Laenge", "Breite", "Notiz") VALUES (1,'01','Bremen',250,120,'');
SET IDENTITY_INSERT dbo.app_warehouse OFF;
GO 

If you have relationships, you have to import the referenced (parent) table first, and then the tables holding the foreign keys.
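If the dependency order is hard to untangle, an alternative is to disable the child table's constraints during the load and re-check them afterwards. A sketch via sqlcmd, with placeholder server and credentials; app_warehouse_inserts.sql is a hypothetical file holding the INSERT block above:

sqlcmd -S your_server -d your_db -U your_user -P your_pass -Q "ALTER TABLE dbo.app_warehouse NOCHECK CONSTRAINT ALL"
sqlcmd -S your_server -d your_db -U your_user -P your_pass -i app_warehouse_inserts.sql
sqlcmd -S your_server -d your_db -U your_user -P your_pass -Q "ALTER TABLE dbo.app_warehouse WITH CHECK CHECK CONSTRAINT ALL"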


You can also use ODBC + the 'SQL Server Import and Export Wizard'. The link below describes the process: https://www.mssqltips.com/sqlservertutorial/2205/mysql-to-sql-server-data-migration/



Run:

mysqldump -u root -p your_target_DB --compatible=mssql > MSSQL_Compatible_Data.sql

Do you want to see a progress bar? Pipe the dump through pv:

mysqldump -u root -p your_target_DB --compatible=mssql | pv > MSSQL_Compatible_Data.sql
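To make pv show a percentage rather than just throughput, you can feed it a size estimate. A sketch, keeping in mind that data_length from information_schema only approximates the final dump size:

SIZE=$(mysql -u root -p -N -e "SELECT SUM(data_length) FROM information_schema.tables WHERE table_schema='your_target_DB'")
mysqldump -u root -p your_target_DB --compatible=mssql | pv -s "$SIZE" > MSSQL_Compatible_Data.sql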


You can use the sqlie application for converting from MySQL to SQL Server; this video shows how: https://www.youtube.com/watch?v=iTVEqys_vTQ&t=108s
