PowerShell: pipe file contents into an application without loading the file into memory

With cmd I'd run mysql -uroot database < filename.sql to import a database dump (read from file and pass to MySQL). However, < is "reserved" in PowerShell.

Instead, in PowerShell I use get-content filename.sql | mysql -uroot database. The caveat is that PowerShell reads filename.sql completely into memory before passing it along to MySQL, and with large database dumps it simply runs out of memory.

Obviously, I could execute this via cmd, but I have a handful of PowerShell scripts automating various tasks like this and I don't want to rewrite them all in batch. In this particular case, filename.sql is a variable that's specified via PS parameters when the automation kicks off.
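
For context, a rough sketch of how such a script takes the path (the parameter name here is hypothetical; the real scripts may differ):

param(
    # Hypothetical parameter name used only for illustration.
    [Parameter(Mandatory = $true)]
    [string]$DumpFile
)

# Current approach: loads the entire dump into memory before piping it on.
get-content $DumpFile | mysql -uroot database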

So how do I get around this memory limitation? Is there another way to pipe the file contents into MySQL directly?


You can try

mysql -uroot -pYourPassword -e "source C:\temp\filename.SQL"

or

mysql --user=root --password=YourPassword --execute="source C:\temp\filename.SQL"
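
Since the dump path arrives as a PowerShell variable in the question's scripts, a minimal sketch of invoking the same idea from PowerShell might look like this (assuming $SqlFile holds the path; the password is a placeholder):

# mysql reads the file itself via its "source" command,
# so PowerShell never buffers the dump in memory.
& mysql -uroot -pYourPassword -e "source $SqlFile" database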

If things start to get complicated, maybe you should write a C# console application to handle the more complex tasks.


Not sure if this will work for your application or not (it should process the file in chunks of 1000 records at a time, rather than all at once):

get-content filename.sql -readcount 1000 |% {$_ | mysql -uroot database}


I'd say stay away from the cmdlets for large files. I've been doing something similar with files that are 30+ million lines long and have not had an issue using the code below. It performs extremely well, both in speed and in memory consumption.

# $filetoread holds the path to the file; OpenText returns a StreamReader,
# so the file is read one line at a time rather than loaded whole.
$reader = [IO.File]::OpenText($filetoread)
while ($reader.Peek() -ge 0) {
    $line = $reader.ReadLine()

    # do your thing here

}
$reader.Close()
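
To tie this back to the question, the same reader pattern can feed mysql's standard input directly, so the dump is streamed line by line instead of being held in memory. A minimal sketch (an assumption, not tested: mysql is on the PATH and $filetoread holds the dump path):

$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = "mysql"
$psi.Arguments = "-uroot database"
$psi.RedirectStandardInput = $true
$psi.UseShellExecute = $false

$proc = [System.Diagnostics.Process]::Start($psi)
$reader = [IO.File]::OpenText($filetoread)
try {
    # Stream one line at a time into mysql's stdin.
    while ($reader.Peek() -ge 0) {
        $proc.StandardInput.WriteLine($reader.ReadLine())
    }
}
finally {
    $reader.Close()
    $proc.StandardInput.Close()
    $proc.WaitForExit()
}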


From Windows cmd (this example is for PostgreSQL; adapt it to your own client): psql -h 127.0.0.1 -p 5432 -f database.sql -U .... ....
