Removing columns in SQL file
I have a big SQL file (~200 MB) containing many INSERT statements:
insert into `films_genres`
(`id`,`film_id`,`genre_id`,`num`)
values
(1,1,1,1),
(2,1,17,2),
(3,2,1,1),
...
How can I remove or ignore the columns `id` and `num` in the script?
The easiest way might be to do the full insert into a temporary holding table, then insert only the desired columns into the real table from there.
insert into `films_genres_temp`
(`id`,`film_id`,`genre_id`,`num`)
values
(1,1,1,1),
(2,1,17,2),
(3,2,1,1),
...
insert into `films_genres`
(`film_id`,`genre_id`)
select `film_id`,`genre_id`
from `films_genres_temp`
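The holding-table approach above can be sketched end to end with SQLite (table and column names come from the question; `films_genres_temp` is the answer's own choice, and the sample rows are the three from the dump):

```python
import sqlite3

# In-memory database just for the demonstration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Holding table matches the dump's full column list; the real table
# keeps only the columns we want.
cur.execute("CREATE TABLE films_genres_temp (id INT, film_id INT, genre_id INT, num INT)")
cur.execute("CREATE TABLE films_genres (film_id INT, genre_id INT)")

# The dump's INSERTs go into the holding table unchanged...
cur.execute(
    "INSERT INTO films_genres_temp (id, film_id, genre_id, num) "
    "VALUES (1,1,1,1), (2,1,17,2), (3,2,1,1)"
)

# ...then only the desired columns are copied into the real table.
cur.execute(
    "INSERT INTO films_genres (film_id, genre_id) "
    "SELECT film_id, genre_id FROM films_genres_temp"
)
cur.execute("DROP TABLE films_genres_temp")
conn.commit()

print(cur.execute("SELECT film_id, genre_id FROM films_genres").fetchall())
# [(1, 1), (1, 17), (2, 1)]
```

The same pattern works in MySQL with `CREATE TEMPORARY TABLE`, which drops the holding table automatically when the session ends.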
CREATE TABLE #MyTempTable (id int,film_id smallint, genre_id int, num int)
INSERT INTO #MyTempTable (id
,film_id
,genre_id
,num
)
[Data goes here]
insert into films_genres (film_id
,genre_id
) select film_id
,genre_id
from #MyTempTable
drop table #MyTempTable
This Perl one-liner should do it:
perl -p -i.bak -e 's/\([^,]+,/\(/g; s/,[^,]+\)/\)/g' sqlfile
It strips the first value after each opening parenthesis and the last value before each closing one, which removes the `id` and `num` values from every row (and their entries from the column list). The file is edited in place, with a backup copy saved with the extension .bak.
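For readers more comfortable in Python, here is a sketch of the same two substitutions (like the one-liner, it assumes no value contains a comma or a parenthesis, which holds for this integer-only dump):

```python
import re

def strip_first_and_last(line: str) -> str:
    # Drop the first value after each "(" ...
    line = re.sub(r"\([^,]+,", "(", line)
    # ...and the last value before each ")".
    return re.sub(r",[^,]+\)", ")", line)

# Works on both the data rows and the column list:
print(strip_first_and_last("(1,1,1,1),"))
# (1,1),
print(strip_first_and_last("(`id`,`film_id`,`genre_id`,`num`)"))
# (`film_id`,`genre_id`)
```

Applying this function to each line of the dump produces the same result as the Perl command, without editing the original file.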
Or if you prefer Ruby:
ruby -p -i.bak -e 'gsub(/\([^,]+,/, "("); gsub(/,[^,]+\)/, ")")' sqlfile