
Failure parsing JSON with mongoimport

I get an Assertion: 10340:Failure parsing JSON string error when running mongoimport in a pipe over the Github API, like the following:

lsoave@ubuntu:~/rails/github/gitwatcher$ curl https://api.github.com/users/lgs/repos | mongoimport -h localhost -d gitwatch_dev -c repo -f repositories
connected to: localhost
 % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                Dload  Upload   Total   Spent    Left  Speed
 0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0Mon Jun 20 00:56:01 Assertion: 10340:Failure parsing JSON string near: [
100 22303  100 22303    0     0  31104      0 --:--:-- --:--:-- --:--:--  111k
0x816d8a1 0x8118814 0x84b357a 0x84b5bb8 0x84adc65 0x84b2ee1 0x60bbd6 0x80f5bc1
mongoimport(_ZN5mongo11msgassertedEiPKc+0x221) [0x816d8a1]
mongoimport(_ZN5mongo8fromjsonEPKcPi+0x3b4) [0x8118814]
mongoimport(_ZN6Import9parseLineEPc+0x7a) [0x84b357a]
mongoimport(_ZN6Import3runEv+0x1a98) [0x84b5bb8]
mongoimport(_ZN5mongo4Tool4mainEiPPc+0x1ce5) [0x84adc65]
mongoimport(main+0x51) [0x84b2ee1]
/lib/tls/i686/cmov/libc.so.6(__libc_start_main+0xe6) [0x60bbd6]
mongoimport(__gxx_personality_v0+0x3f1) [0x80f5bc1]
exception:Failure parsing JSON string near: [
[
...
...
Mon Jun 20 00:45:20 Assertion: 10340:Failure parsing JSON string near: "name": "t
0x816d8a1 0x8118814 0x84b357a 0x84b5bb8 0x84adc65 0x84b2ee1 0x126bd6 0x80f5bc1
mongoimport(_ZN5mongo11msgassertedEiPKc+0x221) [0x816d8a1]
mongoimport(_ZN5mongo8fromjsonEPKcPi+0x3b4) [0x8118814]
mongoimport(_ZN6Import9parseLineEPc+0x7a) [0x84b357a]
mongoimport(_ZN6Import3runEv+0x1a98) [0x84b5bb8]
mongoimport(_ZN5mongo4Tool4mainEiPPc+0x1ce5) [0x84adc65]
mongoimport(main+0x51) [0x84b2ee1]
/lib/tls/i686/cmov/libc.so.6(__libc_start_main+0xe6) [0x126bd6]
mongoimport(__gxx_personality_v0+0x3f1) [0x80f5bc1]
exception:Failure parsing JSON string near: "name": "t
"name": "tentacles"
...
...

See the full trace here: http://pastie.org/2093486. In any case, the JSON I get back from the Github API ( curl https://api.github.com/users/lgs/repos ) looks fine:

[
 {
    "open_issues": 0,
    "watchers": 3,
    "homepage": "http://scrubyt.org",
    "language": null,
    "forks": 1,
    "pushed_at": "2009-02-25T22:49:08Z",
    "created_at": "2009-02-25T22:22:40Z",
    "fork": true,
    "url": "https://api.github.com/repos/lgs/scrubyt",
    "private": false,
    "size": 188,
    "description": "A simple to learn and use, yet powerful web scraping toolkit!",
    "owner": {
     "avatar_url": "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2
Fgravatar-140.png",
     "login": "lgs",
     "url": "https://api.github.com/users/lgs",
     "id": 1573
    },
    "name": "scrubyt",
    "html_url": "https://github.com/lgs/scrubyt"
 },
...
...
]

Here is a snippet: http://www.pastie.org/2093524.

If I try specifying csv format, it works:

lsoave@ubuntu:~/rails/github/gitwatcher$ curl https://api.github.com/users/lgs/repos | mongoimport -h localhost -d gitwatch_dev -c repo -f repositories --type csv
connected to: localhost
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22303  100 22303    0     0  23914      0 --:--:-- --:--:-- --:--:--  106k
imported 640 objects
lsoave@ubuntu:~/rails/github/gitwatcher$ 


Using "mongoimport --jsonArray ..." worked for me.
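For the pipe in the question, that would look something like the sketch below (--jsonArray tells mongoimport to read the whole input as a single JSON array instead of one document per line; the -f field list from the original command is dropped here since field lists are used for csv/tsv input):

# pipe the Github API response straight into mongoimport as a JSON array
curl https://api.github.com/users/lgs/repos | mongoimport -h localhost -d gitwatch_dev -c repo --jsonArray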


Alright, here is what could be going on. First, I removed all the newlines in the JSON to reduce the number of errors from n (where n = number of lines) to 1. It then turned out I had to wrap the JSON array in another field, and it worked after that. I think mongoimport is designed to work with mongoexport, so most likely you cannot use it to import arbitrary JSON. However, if you want to, what I did is something you'd have to do in code before calling the import utility, as sketched below.
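As a minimal sketch of that preprocessing step (assuming jq is available, which is not part of this answer; the somedata field and the file name are arbitrary), you can compact the array onto a single line and wrap it in a field before importing:

# wrap the array in a "somedata" field and emit it as one compact line
curl https://api.github.com/users/lgs/repos | jq -c '{somedata: .}' > repos.import.json

# import the single wrapped document
mongoimport -h localhost -d gitwatch_dev -c repo --file repos.import.json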

I used only 1 record while I was testing. Here is the record with no newlines.

[{"url":"https://api.github.com/repos/lgs/scrubyt", "pushed_at": "2009-02-25T22:49:08Z","homepage": "http://scrubyt.org",  "forks": 1,"language": null,"fork": true,"html_url": "https://github.com/lgs/scrubyt","created_at": "2009-02-25T22:22:40Z", "open_issues": 0,"private": false,"size": 188,"watchers": 3,"owner": {"url": "https://api.github.com/users/lgs","login": "lgs","id": 1573,"avatar_url": "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-140.png"},"name": "scrubyt","description": "A simple to learn and use, yet powerful web scraping toolkit!"}]

Then I wrapped it with somedata (you can use any name here):

{somedata:[{"url":"https://api.github.com/repos/lgs/scrubyt", "pushed_at": "2009-02-25T22:49:08Z","homepage": "http://scrubyt.org",  "forks": 1,"language": null,"fork": true,"html_url": "https://github.com/lgs/scrubyt","created_at": "2009-02-25T22:22:40Z", "open_issues": 0,"private": false,"size": 188,"watchers": 3,"owner": {"url": "https://api.github.com/users/lgs","login": "lgs","id": 1573,"avatar_url": "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-140.png"},"name": "scrubyt","description": "A simple to learn and use, yet powerful web scraping toolkit!"}]}

And I was able to see the record in Mongo.

> db.repo.findOne()
{
    "_id" : ObjectId("4dff91d29c73f72483e82ef2"),
    "somedata" : [
        {
            "url" : "https://api.github.com/repos/lgs/scrubyt",
            "pushed_at" : "2009-02-25T22:49:08Z",
            "homepage" : "http://scrubyt.org",
            "forks" : 1,
            "language" : null,
            "fork" : true,
            "html_url" : "https://github.com/lgs/scrubyt",
            "created_at" : "2009-02-25T22:22:40Z",
            "open_issues" : 0,
            "private" : false,
            "size" : 188,
            "watchers" : 3,
            "owner" : {
                "url" : "https://api.github.com/users/lgs",
                "login" : "lgs",
                "id" : 1573,
                "avatar_url" : "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-140.png"
            },
            "name" : "scrubyt",
            "description" : "A simple to learn and use, yet powerful web scraping toolkit!"
        }
    ]
}

Hope this helps!


This worked fine for me after I removed every '\n'. On Linux you can use tr; note that the output has to go to a different file, because redirecting back into the input file would truncate it before it is read: tr -d '\n' < file.json > file.import.json


Using both of the answers provided by @Daniel and @lobster1234, I created a script that I use to import the JSON entries into Mongo.

#!/bin/sh

# Require the path to a .json file as the first argument
if [ -z "$1" ]; then
    echo "missing argument"
    exit 1
fi

# Strip the .json extension to build the name of the intermediate file
FILE=${1%%.json}

echo "$FILE"

# Remove newlines so the whole JSON array ends up on a single line
tr -d '\n' < "$FILE.json" > "$FILE.import.json"

# Import the array into main.collection, upserting matching documents
mongoimport --collection collection --db main --file "$FILE.import.json" --jsonArray --upsert
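A quick usage sketch (the script and data file names are just examples; the database and collection are the ones hard-coded above):

# save the script as import.sh, make it executable and run it against a .json file
chmod +x import.sh
curl https://api.github.com/users/lgs/repos > repos.json
./import.sh repos.json
# this produces repos.import.json and imports it into main.collection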
