How do I speed up (or break up) this MySQL query?

I'm building a video recommendation site (think Pandora for music videos) in Python and MySQL. I have three tables in my db:

video - a table of the videos. Data doesn't change. Columns are:

CREATE TABLE `video` (
    id int(11) NOT NULL AUTO_INCREMENT,
    website_id smallint(3) unsigned DEFAULT '0',
    rating_global varchar(128) DEFAULT '0',
    title varchar(256) DEFAULT NULL,
    thumb_url text,
PRIMARY KEY (`id`),
KEY `websites` (`website_id`),
KEY `id` (`id`) USING BTREE
) ENGINE=InnoDB AUTO_INCREMENT=49362 DEFAULT CHARSET=utf8

video_tag - a table of the tags (attributes) associated with each video. Doesn't change.

CREATE TABLE `video_tag` (
    id int(7) NOT NULL AUTO_INCREMENT,
    video_id mediumint(7) unsigned DEFAULT '0',
    tag_id mediumint(7) unsigned DEFAULT '0',
PRIMARY KEY (`id`),
KEY `video_id` (`video_id`),
KEY `tag_id` (`tag_id`)
) ENGINE=InnoDB AUTO_INCREMENT=562456 DEFAULT CHARSET=utf8

user_rating - a table of good or bad ratings that the user has given each tag. Data always changing.

CREATE TABLE `user_rating` (
    id int(11) NOT NULL AUTO_INCREMENT,
    user_id smallint(3) unsigned DEFAULT '0',
    tag_id int(5) unsigned DEFAULT '0',
    tag_rating float(10,5) DEFAULT '0',
PRIMARY KEY (`id`),
KEY `video` (`tag_id`),
KEY `user_id` (`user_id`) USING BTREE
) ENGINE=InnoDB AUTO_INCREMENT=447 DEFAULT CHARSET=utf8

Based on the user's preferences, I want to score each unwatched video and try to predict what they will like best. This has resulted in the following massive query, which takes about 2 seconds to complete for 50,000 videos:

SELECT video_tag.video_id, 
       (sum(user_rating.tag_rating) * video.rating_global) as score 

FROM video_tag 
JOIN user_rating ON user_rating.tag_id = video_tag.tag_id
JOIN video ON video.id = video_tag.video_id 

WHERE user_rating.user_id = 1 AND video.website_id = 2 
AND rating_global > 0 AND video_id NOT IN (1,2,3) GROUP BY video_id 
ORDER BY score DESC LIMIT 20

I desperately need to make this more efficient, so I'm just looking for advice as to what the best direction is. Some ideas I've considered:

a) Rework my db table structure (not sure how)

b) Offload more of the grouping and aggregation into Python (haven't figured out a way to join three tables that is actually faster)

c) Store the non-changing tables in memory to try and speed computation time (earlier tinkering hasn't yielded any gains yet..)

How would you recommend making this more efficient?

Thank you!!

--

Per request in the comments, EXPLAIN SELECT.. shows:

id  select_type table   type    possible_keys   key key_len ref rows    Extra
1   SIMPLE  user_rating ref      video,user_id  user_id 3   const   88  Using where; Using temporary; Using filesort
1   SIMPLE  video_tag   ref      video_id,tag_id    tag_id  4   db.user_rating.tag_id   92  Using where
1   SIMPLE  video       eq_ref  PRIMARY,websites,id PRIMARY 4   db.video_tag.video_id   1   Using where


  • Change the field type of *rating_global* to a numeric type (either FLOAT or INT); there's no need for it to be VARCHAR. Personally I would change all rating fields to INT, as I see no need for them to be FLOAT.

  • Drop the KEY on id; the PRIMARY KEY is already an index on that column.

  • Watch the integer sizes of your reference columns (e.g. video_tag.video_id -> video.id); you may run out of numbers. The referencing column should use the same type as the column it points to.
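Taken together, the schema fixes above might look something like this (a sketch only; the exact types are illustrative, and video_tag.video_id is widened to match video.id):

```sql
-- Store the rating as a number instead of a VARCHAR
ALTER TABLE video MODIFY rating_global FLOAT DEFAULT 0;

-- Drop the redundant secondary index; PRIMARY KEY already indexes id
ALTER TABLE video DROP INDEX id;

-- Make the referencing columns match the referenced types
-- (video.id is INT, so video_tag.video_id should be INT too)
ALTER TABLE video_tag MODIFY video_id INT UNSIGNED DEFAULT 0;
ALTER TABLE user_rating MODIFY tag_id MEDIUMINT UNSIGNED DEFAULT 0;
```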

I suggest the following 2-step solution to replace your query:

CREATE TEMPORARY TABLE rating_stats ENGINE=MEMORY
SELECT video_id, SUM(tag_rating) AS tag_rating_sum 
FROM user_rating ur JOIN video_tag vt ON vt.tag_id = ur.tag_id AND ur.user_id = 1
GROUP BY video_id ORDER BY NULL;

SELECT v.id, tag_rating_sum*rating_global AS score FROM video v 
JOIN rating_stats rs ON rs.video_id = v.id 
WHERE v.website_id=2 AND v.rating_global > 0 AND v.id NOT IN (1,2,3)
ORDER BY score DESC LIMIT 20

For the latter query to perform really fast, you could add a composite index on the video table covering website_id and rating_global (perhaps website_id alone is enough).
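For example (assuming rating_global has already been converted to a numeric type, since a range comparison on a VARCHAR column can't use the index effectively):

```sql
-- Composite index so the WHERE clause on website_id and
-- rating_global can be resolved from the index
ALTER TABLE video ADD INDEX website_rating (website_id, rating_global);
```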

You can also store these statistics in a separate table and precalculate them periodically, based on user login/action frequency. You could then show the cached data instead of live results; there shouldn't be much visible difference.
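A minimal sketch of that cache table (the table name and the hard-coded user_id = 1 are illustrative; in practice you'd refresh it per user on login or from a background job):

```sql
-- Hypothetical per-user score cache, refreshed periodically
CREATE TABLE user_video_score (
    user_id  SMALLINT UNSIGNED NOT NULL,
    video_id INT UNSIGNED NOT NULL,
    score    FLOAT NOT NULL DEFAULT 0,
    PRIMARY KEY (user_id, video_id),
    KEY user_score (user_id, score)
);

-- Recompute scores for one user (here user 1)
REPLACE INTO user_video_score (user_id, video_id, score)
SELECT 1, vt.video_id, SUM(ur.tag_rating) * v.rating_global
FROM video_tag vt
JOIN user_rating ur ON ur.tag_id = vt.tag_id AND ur.user_id = 1
JOIN video v ON v.id = vt.video_id
GROUP BY vt.video_id;
```

The recommendation read then becomes a trivial indexed lookup: `SELECT video_id FROM user_video_score WHERE user_id = 1 ORDER BY score DESC LIMIT 20`.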
