Efficient Query for Extended Details off an Association Object Pattern

Details

  1. A Person has many Objectives.
  2. Objectives have Person-specific details about Activities.
  3. An Activity contains generic information such as a world record.
  4. A Person can organize an Event to attempt the Objective.
  5. A Person invites other Persons to watch an Event with an Invitation.

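The relationships in the list above could be modeled as an SQLAlchemy ORM mapping along these lines, with Invitation as the association object. All table names, column names, and types here are my own placeholders for illustration, not the actual schema:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Activity(Base):
    __tablename__ = "activity"
    id = Column(Integer, primary_key=True)
    world_record = Column(String)  # generic information, e.g. a world record

class Person(Base):
    __tablename__ = "person"
    id = Column(Integer, primary_key=True)
    objectives = relationship("Objective", back_populates="person")

class Objective(Base):
    __tablename__ = "objective"
    id = Column(Integer, primary_key=True)
    person_id = Column(Integer, ForeignKey("person.id"))
    activity_id = Column(Integer, ForeignKey("activity.id"))
    person = relationship("Person", back_populates="objectives")
    activity = relationship("Activity")

class Event(Base):
    __tablename__ = "event"
    id = Column(Integer, primary_key=True)
    organizer_id = Column(Integer, ForeignKey("person.id"))
    objective_id = Column(Integer, ForeignKey("objective.id"))
    objective = relationship("Objective")

class Invitation(Base):
    __tablename__ = "invitation"
    # Association object between the invited Person and the Event,
    # carrying its own state (the invitation status).
    person_id = Column(Integer, ForeignKey("person.id"), primary_key=True)
    event_id = Column(Integer, ForeignKey("event.id"), primary_key=True)
    status = Column(String)
    person = relationship("Person")
    event = relationship("Event")
```
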
Schema

Note: Only backrefs are listed on the example schema diagram, indicated by "(fk)". The arrows imply the normal relationships.

Image Link Until I Get 10 Points To Use Image Tag

Question

I want most Event, Objective, and Activity details for all Invitations one Person received (regardless of status, though the status is still needed) displayed at once.

Is there a better way to represent the problem before I try tackling a JOIN like this? I believe the Person -> Invitation <- Event is an Association Object pattern, but I am unsure of how to get the Objective and Activity information in a clean, efficient manner for each Invitation returned.

Bonus: Provide sample SQLAlchemy query.


On the SQL side, this is pretty straightforward. I built some tables for testing using only one id number (for persons); all the rest of the keys were natural keys. Looking at the execution plan for this query

select I.*, A.activity_placeholder, E.event_location, O.objective_placeholder
from event_invitations I
inner join activity A 
        on (I.activity_name = A.activity_name)
inner join events E 
        on (I.personal_id = E.personal_id 
        and I.activity_name = E.activity_name
        and I.objective_deadline = E.objective_deadline
        and I.event_time = E.event_time)
inner join personal_objectives O 
        on (I.personal_id = O.personal_id
        and I.activity_name = O.activity_name
        and I.objective_deadline = O.objective_deadline)
where I.person_invited_id = 2;

shows that the DBMS (PostgreSQL) uses indexes throughout, except for a sequential scan on event_invitations. I'm sure that's because I used very little data, so all these tables easily fit into RAM. (When tables fit in RAM, it's often faster to scan a small table than to use the index.)

The optimizer estimates the cost for each part of the query to be 0.00, and you can't get much better than that. Actual run time was less than 0.2 milliseconds, but that doesn't mean much.

I'm sure you can translate this into SQLAlchemy. Let me know if you want me to post my tables and sample data.
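
For the bonus, here is a sketch of how that SQL might translate into SQLAlchemy Core. The Table definitions mirror the column names in the query, but the types and the sample row are my own assumptions for the demo, not the actual schema:

```python
from sqlalchemy import (Column, Integer, MetaData, String, Table, and_,
                        create_engine, select)

metadata = MetaData()

activity = Table("activity", metadata,
    Column("activity_name", String, primary_key=True),
    Column("activity_placeholder", String))

events = Table("events", metadata,
    Column("personal_id", Integer, primary_key=True),
    Column("activity_name", String, primary_key=True),
    Column("objective_deadline", String, primary_key=True),
    Column("event_time", String, primary_key=True),
    Column("event_location", String))

personal_objectives = Table("personal_objectives", metadata,
    Column("personal_id", Integer, primary_key=True),
    Column("activity_name", String, primary_key=True),
    Column("objective_deadline", String, primary_key=True),
    Column("objective_placeholder", String))

inv = Table("event_invitations", metadata,
    Column("person_invited_id", Integer, primary_key=True),
    Column("personal_id", Integer, primary_key=True),
    Column("activity_name", String, primary_key=True),
    Column("objective_deadline", String, primary_key=True),
    Column("event_time", String, primary_key=True),
    Column("status", String))

# Same join shape as the SQL above: each join condition matches the
# composite natural keys between the tables.
stmt = (
    select(inv, activity.c.activity_placeholder,
           events.c.event_location,
           personal_objectives.c.objective_placeholder)
    .join(activity, inv.c.activity_name == activity.c.activity_name)
    .join(events, and_(
        inv.c.personal_id == events.c.personal_id,
        inv.c.activity_name == events.c.activity_name,
        inv.c.objective_deadline == events.c.objective_deadline,
        inv.c.event_time == events.c.event_time))
    .join(personal_objectives, and_(
        inv.c.personal_id == personal_objectives.c.personal_id,
        inv.c.activity_name == personal_objectives.c.activity_name,
        inv.c.objective_deadline == personal_objectives.c.objective_deadline))
    .where(inv.c.person_invited_id == 2)
)

# Smoke test against in-memory SQLite with one matching row of made-up data.
engine = create_engine("sqlite://")
metadata.create_all(engine)
with engine.begin() as conn:
    conn.execute(activity.insert(), [
        {"activity_name": "running", "activity_placeholder": "100m"}])
    conn.execute(events.insert(), [
        {"personal_id": 1, "activity_name": "running",
         "objective_deadline": "2011-12-31", "event_time": "18:00",
         "event_location": "track"}])
    conn.execute(personal_objectives.insert(), [
        {"personal_id": 1, "activity_name": "running",
         "objective_deadline": "2011-12-31",
         "objective_placeholder": "sub-12s"}])
    conn.execute(inv.insert(), [
        {"person_invited_id": 2, "personal_id": 1,
         "activity_name": "running", "objective_deadline": "2011-12-31",
         "event_time": "18:00", "status": "pending"}])
    rows = conn.execute(stmt).all()
```

Each row then carries the invitation columns (including status) plus the joined activity_placeholder, event_location, and objective_placeholder, so everything displays in one query.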
