Efficient Query for Extended Details from an Association Object Pattern
Details
- A Person has many Objectives. Objectives have Person-specific details about Activities.
- An Activity contains generic information such as a world record.
- A Person can organize an Event to attempt the Objective.
- A Person invites other Persons to watch an Event with an Invitation.
Schema
Note: Only backrefs are listed on the example schema diagram, indicated by "(fk)". The arrows imply the normal relationship.
Image Link Until I Get 10 Points To Use Image Tag
Question
I want most Event, Objective, and Activity details for all Invitations one Person received (regardless of status, though the status is still needed), displayed at once.
Is there a better way to represent the problem before I try tackling a JOIN like this? I believe the Person -> Invitation <- Event is an Association Object pattern, but I am unsure of how to get the Objective and Activity information in a clean, efficient manner for each Invitation returned.
Bonus: Provide sample SQLAlchemy query.
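For concreteness, here is a rough sketch of how I picture the mappings in SQLAlchemy's declarative style. The class and column names are only placeholders, and I have used surrogate integer keys to keep it short, even though the real schema may use natural keys:

    from sqlalchemy import Column, Integer, String, DateTime, ForeignKey
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()

    class Person(Base):
        __tablename__ = 'person'
        id = Column(Integer, primary_key=True)
        name = Column(String)

    class Activity(Base):
        __tablename__ = 'activity'
        id = Column(Integer, primary_key=True)
        world_record = Column(String)   # generic, Person-independent details

    class Objective(Base):
        # Person-specific details about an Activity
        __tablename__ = 'objective'
        id = Column(Integer, primary_key=True)
        person_id = Column(Integer, ForeignKey('person.id'))
        activity_id = Column(Integer, ForeignKey('activity.id'))
        deadline = Column(DateTime)
        person = relationship('Person', backref='objectives')
        activity = relationship('Activity')

    class Event(Base):
        # Organized by a Person to attempt an Objective
        __tablename__ = 'event'
        id = Column(Integer, primary_key=True)
        objective_id = Column(Integer, ForeignKey('objective.id'))
        organizer_id = Column(Integer, ForeignKey('person.id'))
        location = Column(String)
        event_time = Column(DateTime)
        objective = relationship('Objective', backref='events')

    class Invitation(Base):
        # Association object between the invited Person and the Event
        __tablename__ = 'invitation'
        person_id = Column(Integer, ForeignKey('person.id'), primary_key=True)
        event_id = Column(Integer, ForeignKey('event.id'), primary_key=True)
        status = Column(String)
        person = relationship('Person', backref='invitations')
        event = relationship('Event', backref='invitations')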
On the SQL side, this is pretty straightforward. I built some tables for testing using only one id number (for persons); all the rest of the keys were natural keys. Looking at the execution plan for this query
select I.*, A.activity_placeholder, E.event_location, O.objective_placeholder
from event_invitations I
inner join activity A
on (I.activity_name = A.activity_name)
inner join events E
on (I.personal_id = E.personal_id
and I.activity_name = E.activity_name
and I.objective_deadline = E.objective_deadline
and I.event_time = E.event_time)
inner join personal_objectives O
on (I.personal_id = O.personal_id
and I.activity_name = O.activity_name
and I.objective_deadline = O.objective_deadline)
where I.person_invited_id = 2;
shows that the dbms (PostgreSQL) is using indexes throughout except for a sequential scan on event_invitations. I'm sure that's because I used very little data, so all these tables easily fit into RAM. (When tables fit in RAM, it's often faster to scan a small table than to use the index.)
The optimizer estimates the cost for each part of the query to be 0.00, and you can't get much better than that. Actual run time was less than 0.2 milliseconds, but that doesn't mean much.
I'm sure you can translate this into SQLAlchemy. Let me know if you want me to post my tables and sample data.
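One possible translation into SQLAlchemy Core follows (a sketch assuming SQLAlchemy 1.4+; the connection URL is a placeholder, and the table and column names are the ones from my test tables above):

    from sqlalchemy import create_engine, MetaData, Table, select, and_

    # Placeholder connection string; point this at your own database.
    engine = create_engine('postgresql:///example')
    metadata = MetaData()

    # Reflect the test tables by name.
    invitations = Table('event_invitations', metadata, autoload_with=engine)
    activity = Table('activity', metadata, autoload_with=engine)
    events = Table('events', metadata, autoload_with=engine)
    objectives = Table('personal_objectives', metadata, autoload_with=engine)

    stmt = (
        select(
            invitations,                       # I.*
            activity.c.activity_placeholder,
            events.c.event_location,
            objectives.c.objective_placeholder,
        )
        .select_from(
            invitations
            .join(activity,
                  invitations.c.activity_name == activity.c.activity_name)
            .join(events, and_(
                invitations.c.personal_id == events.c.personal_id,
                invitations.c.activity_name == events.c.activity_name,
                invitations.c.objective_deadline == events.c.objective_deadline,
                invitations.c.event_time == events.c.event_time,
            ))
            .join(objectives, and_(
                invitations.c.personal_id == objectives.c.personal_id,
                invitations.c.activity_name == objectives.c.activity_name,
                invitations.c.objective_deadline == objectives.c.objective_deadline,
            ))
        )
        .where(invitations.c.person_invited_id == 2)
    )

    with engine.connect() as conn:
        for row in conn.execute(stmt):
            print(row)

If you map ORM classes instead, the same joins can be expressed with query/select against those classes, but the generated SQL should be essentially the one shown above.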