How to export a MySQL database to JSON?
I am interested in exporting a subset of values from a MySQL database into a JSON-formatted file on disk.
I found a link that talks about a possible way to do this: http://www.thomasfrank.se/mysql_to_json.html
... but when I use the method from that page, it seems to work but with two problems:
1) It only returns around 15 results, with the last one abruptly cut off (incomplete). My standard query for this returns around 4000 results when I just run it as SELECT name, email FROM students WHERE enrolled IS NULL
But when I run it as:
SELECT CONCAT(
    "[",
    GROUP_CONCAT(
        CONCAT("{name:'", name, "'"),
        CONCAT(",email:'", email, "'}")
    ),
    "]"
) AS json
FROM students WHERE enrolled IS NULL;
... as described in the link, it only returns (as I mentioned) 15 results. (fwiw, I checked these results against the 4000 I'm supposed to get, and these 15 are the same as the first 15 of the 4000)
2) There seem to be "escape" characters included in the actual file when I add INTO OUTFILE '/path/to/jsonoutput.txt' FIELDS TERMINATED BY ','
to the end of that query. So commas end up looking like '\,' when obviously I would just like to have the commas without the \.
Any ideas on how to get proper JSON output from MySQL? (Either using this method, or some other method)?
Thanks!
If you have Ruby, you can install the mysql2xxxx gem (not the mysql2json gem, which is a different gem):
$ gem install mysql2xxxx
and then run the command
$ mysql2json --user=root --password=password --database=database_name --execute "select * from mytable" >mytable.json
The gem also provides mysql2csv and mysql2xml. It's not as fast as mysqldump, but it also doesn't suffer from some of mysqldump's weirdnesses (like only being able to dump CSV on the same computer as the MySQL server itself).
It may be asking too much of MySQL to expect it to produce well-formed JSON directly from a query. Instead, consider producing something more convenient, like CSV (using the INTO OUTFILE '/path/to/output.csv' FIELDS TERMINATED BY ',' snippet you already know) and then transforming the results into JSON in a language with built-in support for it, like Python or PHP.
Edit: a Python example, using the fine SQLAlchemy:
class Student(object):
    '''The model, a plain ol' Python class'''
    def __init__(self, name, email, enrolled):
        self.name = name
        self.email = email
        self.enrolled = enrolled

    def __repr__(self):
        return "<Student(%r, %r)>" % (self.name, self.email)

    def make_dict(self):
        return {'name': self.name, 'email': self.email}

import sqlalchemy
import sqlalchemy.orm

metadata = sqlalchemy.MetaData()
students_table = sqlalchemy.Table('students', metadata,
    sqlalchemy.Column('id', sqlalchemy.Integer, primary_key=True),
    sqlalchemy.Column('name', sqlalchemy.String(100)),
    sqlalchemy.Column('email', sqlalchemy.String(100)),
    sqlalchemy.Column('enrolled', sqlalchemy.Date)
)

# connect to the database. substitute the needed values.
engine = sqlalchemy.create_engine('mysql://user:pass@host/database')

# if needed, create the table:
metadata.create_all(engine)

# map the model to the table (classical mapping)
sqlalchemy.orm.mapper(Student, students_table)

# queries are issued through a session, not the engine itself:
Session = sqlalchemy.orm.sessionmaker(bind=engine)
session = Session()
non_students = session.query(Student).filter_by(enrolled=None)

# and let's make some json out of it
# (json.dumps needs a concrete list, not a generator):
import json
non_students_dicts = [student.make_dict() for student in non_students]
students_json = json.dumps(non_students_dicts)
Another possibility is using the MySQL Workbench.
There is a JSON export option at the object browser context menu and at the result grid menu.
More information on MySQL documentation: Data export and import.
HeidiSQL allows you to do this as well.
Highlight any data in the DATA tab or in the query result set, then right-click and select the Export Grid Rows option. This option lets you export any of your data as JSON, straight to the clipboard or directly to a file.
This is something that should be done in the application layer.
For example, in PHP it is as simple as the following.
Edit: added the DB connection stuff. Nothing external needed.
$sql = "select ...";
$db = new PDO ( "mysql:$dbname", $user, $password) ;
$stmt = $db->prepare($sql);
$stmt->execute();
$result = $stmt->fetchAll();
file_put_contents("output.txt", json_encode($result));
I know this is old, but for the sake of somebody looking for an answer...
There's a JSON UDF library for MySQL (lib_mysqludf_json) that can be found here. You need root access to your server and to be comfortable installing plugins (it's simple).
1) upload the lib_mysqludf_json.so into the plugins directory of your mysql installation
2) run the lib_mysqludf_json.sql file (it pretty much does all of the work for you. If you run into trouble just delete anything that starts with 'DROP FUNCTION...')
3) write your query as something like this:
SELECT json_array(
    group_concat(json_object(name, email))
)
FROM ....
WHERE ...
and it will return something like
[
{
"name": "something",
"email": "someone@somewhere.net"
},
{
"name": "someone",
"email": "something@someplace.com"
}
]
Another solution, if you are using Ruby, is to write a connection script to the database with ActiveRecord. You will need to install it first:
gem install activerecord
# ruby ./export-mysql.rb
require 'rubygems'
require 'active_record'
ActiveRecord::Base.establish_connection(
:adapter => "mysql",
:database => "database_name",
:username => "root",
:password => "",
:host => "localhost"
)
class Event < ActiveRecord::Base; end
class Person < ActiveRecord::Base; end
File.open("events.json", "w") { |f| f.write Event.all.to_json }
File.open("people.json", "w") { |f| f.write Person.all.to_json }
You can also add methods to the ActiveRecord classes if you want to manipulate data first or include or exclude certain columns.
Person.all.to_json(:only => [ :id, :name ])
With ActiveRecord you are not limited to JSON. You can just as easily export as XML or YAML
Person.all.to_xml
Person.all.to_yaml
You are not limited to MySQL, either: any database supported by ActiveRecord works (Postgres, SQLite3, Oracle, etc.).
And it's worth mentioning that you can open another handle to a second database:
require 'active_record'
ActiveRecord::Base.configurations["mysql"] = {
:adapter => 'mysql',
:database => 'database_name',
:username => 'root',
:password => '',
:host => 'localhost'
}
ActiveRecord::Base.configurations["sqlite3"] = {
:adapter => 'sqlite3',
:database => 'db/development.sqlite3'
}
class PersonMySQL < ActiveRecord::Base
establish_connection "mysql"
end
class PersonSQLite < ActiveRecord::Base
establish_connection "sqlite3"
end
PersonMySQL.all.each do |person|
PersonSQLite.create(person.attributes.except("id"))
end
Here is a quick little blog post about it: http://www.seanbehan.com/how-to-export-a-mysql-database-to-json-csv-and-xml-with-ruby-and-the-activerecord-gem
You can export any SQL query into JSON directly from phpMyAdmin.
Using MySQL Shell, you can output directly to JSON using only the terminal:
echo "Your SQL query" | mysqlsh --sql --result-format=json --uri=[username]@localhost/[schema_name]
as described in the link, it only returns (as I mentioned) 15 results. (fwiw, I checked these results against the 4000 I'm supposed to get, and these 15 are the same as the first 15 of the 4000)
That's because MySQL restricts the length of the data returned by GROUP_CONCAT to the value set in @@group_concat_max_len; as soon as it reaches that amount, it truncates and returns whatever it has gotten so far.
You can set @@group_concat_max_len in a few different ways; refer to the MySQL documentation for details.
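For example, a minimal sketch of raising the limit for the current session before running the export query (the value here is illustrative; size it to your expected output):

-- raise the per-session GROUP_CONCAT limit (in bytes) before exporting
SET SESSION group_concat_max_len = 1000000;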
Also, if you are exporting in the application layer, don't forget to limit the results. For example, if you have 10M rows, you should fetch the results part by part, as sketched below.
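A sketch of chunked fetching with illustrative page sizes, assuming an id column to give the pages a stable order:

-- fetch page by page; bump OFFSET by the LIMIT until no rows come back
SELECT name, email FROM students WHERE enrolled IS NULL
ORDER BY id LIMIT 100000 OFFSET 0;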
Use the following Ruby code:
require 'mysql2'
require 'json'

client = Mysql2::Client.new(
  :host     => 'your_host',
  :database => 'your_database',
  :username => 'your_username',
  :password => 'your_password')

table_sql = "show tables"
tables = client.query(table_sql, :as => :array)

open('_output.json', 'a') { |f|
  tables.each do |table|
    sql = "select * from `#{table.first}`"
    # rows come back as hashes, which to_json turns into JSON objects
    res = client.query(sql)
    f.puts res.to_a.to_json
  end
}
This might be a more niche answer, but if you are on Windows and using MySQL Workbench, you can just select the table you want and click Export/Import in the result grid. This will give you multiple format options, including .json.
The simplest solution I found was a combination of the mysql and jq commands with a JSON_OBJECT query. Actually, jq is not required if the JSON Lines format is good enough.
Example of dumping from a remote server to a local file:
ssh remote_server \
  "mysql \
    --silent \
    --raw \
    --host '' --port 3306 \
    --user '' --password='' \
    database \
    -e \"SELECT JSON_OBJECT('key', value) FROM table\" |
  jq --slurp --ascii-output ." \
  > dump.json
Example with a books table:
+----+-------+
| id | book |
+----+-------+
| 1 | book1 |
| 2 | book2 |
| 3 | book3 |
+----+-------+
The query would look like:
SELECT JSON_OBJECT('id', id, 'book', book) FROM books;
dump.json output:
[
{
"id": "1",
"book": "book1"
},
{
"id": "2",
"book": "book2"
},
{
"id": "3",
"book": "book3"
}
]
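If you want a single JSON array without piping through jq, MySQL 5.7.22 and later can also aggregate server-side; a hedged alternative sketch using the same books table:

SELECT JSON_ARRAYAGG(JSON_OBJECT('id', id, 'book', book)) FROM books;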
If anyone makes it to here looking for an answer in 2021, this is the way to do it with the MySql Shell.
https://dev.mysql.com/doc/mysql-shell/8.0/en/mysql-shell-json-output.html
It's a JS engine for connecting to and using MySQL from the console; very nice, but it didn't exist 11 years ago.
For anyone that wants to do this using Python, and be able to export all tables without predefining field names etc, I wrote a short Python script for this the other day, hope someone finds it useful:
from contextlib import closing
from datetime import datetime
import json

import MySQLdb

DB_NAME = 'x'
DB_USER = 'y'
DB_PASS = 'z'


def get_tables(cursor):
    cursor.execute('SHOW tables')
    return [r[0] for r in cursor.fetchall()]


def get_rows_as_dicts(cursor, table):
    cursor.execute('select * from {}'.format(table))
    columns = [d[0] for d in cursor.description]
    return [dict(zip(columns, row)) for row in cursor.fetchall()]


def dump_date(thing):
    if isinstance(thing, datetime):
        return thing.isoformat()
    return str(thing)


with closing(MySQLdb.connect(user=DB_USER, passwd=DB_PASS, db=DB_NAME)) as conn, closing(conn.cursor()) as cursor:
    dump = {}
    for table in get_tables(cursor):
        dump[table] = get_rows_as_dicts(cursor, table)

print(json.dumps(dump, default=dump_date, indent=2))
A lot of these answers seem over-engineered. Armed with only a shell and the mysql client, you can do this from the command line. Assuming a database named mydb, it's as simple as:
for t in $(mysql -Bse "select table_name from information_schema.tables where table_schema = 'mydb'")
do
    mysql -Bse "select concat('select json_object(', group_concat('''', column_name, ''',', column_name), ') from ', table_name, ';') from information_schema.columns where table_schema='mydb' and table_name='$t'" mydb \
        | mysql -Bs mydb > "$t.json"
done
Run this, and afterwards you will have one json file per table, no programming languages needed.
The heart of the answer is the following query against information_schema:
select concat(
'select json_object(',
group_concat(
'''',
column_name,
''',',
column_name
),
') from ',
table_name,
';')
from information_schema.columns
where table_schema = 'mydb'
and table_name = ?
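For the students table used earlier on this page (columns id, name, email, enrolled), the generated statement would come out as:

select json_object('id',id,'name',name,'email',email,'enrolled',enrolled) from students;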