Dynamically reference tables without using dynamic SQL
In SQL Server 2008, I have a stored procedure that executes a complex query. I will write the results to a temp table (not a system temp table, but a manually managed one).
The table is named with the string representation of the user's web session ID, which avoids concurrency issues between users. The session ID is passed to the stored procedure, which creates the table. For cleanup, the table name is logged in a maintenance table along with a timestamp.
However, the name is obviously dynamic. I don't want to use dynamic SQL as this precludes any compilation of the complex query. I know this is pushing things, but what I am looking to do is as follows:
DECLARE @TableName varchar(100)
SET @TableName = CAST(@SessionID AS varchar(100))
INSERT INTO @TableName
SELECT....{complex query}
How can I do this?
More info:
The recordset could be tens of thousands of rows (IDs only). The user can apply further filtering to this recordset, thus reducing it. Therefore it has to be a table on the server.
Why don't you create a single temp table of your choice, but include the session ID as a column, so each session's records stay distinct?
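Something like this, as a minimal sketch (SessionResults and ResultID are placeholder names, not anything from your schema):

CREATE TABLE SessionResults (
    SessionID varchar(100) NOT NULL,   -- identifies the owning web session
    ResultID  int NOT NULL             -- one ID row from the complex query
)

-- inside the stored procedure: tag every row with the caller's session ID
INSERT INTO SessionResults (SessionID, ResultID)
SELECT @SessionID, x.ID
FROM ( SELECT ... ) AS x   -- {complex query}

-- later filtering only ever touches that session's rows
SELECT ResultID
FROM SessionResults
WHERE SessionID = @SessionID

Because the table name is static, the complex query can live inside the stored procedure with no dynamic SQL at all.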
You could run your complex query into a #temp table first, and then use dynamic SQL only to copy it into the session-ID-named table afterwards:
select ...          -- {complex query}
into #temp
from ....

declare @sql nvarchar(max)
set @sql = 'select * into ' + quotename(@tablename) + ' from #temp'   -- quotename guards the dynamic name
exec (@sql)
How would it compile a query with a table it doesn't know anything about at compile time? As far as I know, what you want to do is not possible.
Why don't you look for other solutions, like storing the results in an XML field in a table keyed on session ID, one row per user/session/result-set?
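Roughly like this, as a sketch only (SessionResultXml and the XML shape are assumptions, not an existing schema):

CREATE TABLE SessionResultXml (
    SessionID varchar(100) NOT NULL PRIMARY KEY,
    Results   xml NOT NULL,
    CreatedAt datetime NOT NULL DEFAULT GETDATE()   -- for later cleanup
)

-- capture the IDs produced by the complex query as a single XML value
INSERT INTO SessionResultXml (SessionID, Results)
SELECT @SessionID,
       (SELECT ID
        FROM ...   -- {complex query}
        FOR XML PATH('row'), ROOT('ids'), TYPE)

-- shred the IDs back out when the user applies further filtering
SELECT r.value('.', 'int') AS ID
FROM SessionResultXml
CROSS APPLY Results.nodes('/ids/row/ID') AS x(r)
WHERE SessionID = @SessionID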
You could also pull the data back to .NET and store it in Session directly; how well that works depends on which session provider you're using.
You could also store the results in a temp file and track the files in a single table with a time stamp and run a batch job every so often to clean out files from older or inactive sessions.
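The tracking side of that could look something like this (a sketch; SessionResultFiles and the 24-hour window are assumptions):

CREATE TABLE SessionResultFiles (
    SessionID varchar(100) NOT NULL,
    FilePath  nvarchar(260) NOT NULL,
    CreatedAt datetime NOT NULL DEFAULT GETDATE()
)

-- the scheduled job finds stale entries, deletes the files outside SQL,
-- and then removes the matching rows
SELECT SessionID, FilePath
FROM SessionResultFiles
WHERE CreatedAt < DATEADD(hour, -24, GETDATE())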
Added, based on "More Info":
OK. Time to play the "premature optimization" card. :)
Do you have an existing solution that is already too slow? Have you looked at it for ways to streamline it without caching a large result set?
Assuming you have: How many users/sessions do you expect to have? How will these IDs be used once they are isolated and cached? They sound like they are needed for further queries, which means the best place for them is in a table that has a static name.
At this point my suggestion is: store all the results in an association table, e.g. create table tempResults(session_id uniqueidentifier, other_id int). Then you can hard-code joins to this table into other queries.
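For instance (a sketch; Orders and OrderID are just placeholders for whatever the cached IDs refer to):

-- the complex query populates the association table once per session
insert into tempResults (session_id, other_id)
select @SessionID, o.OrderID
from ...   -- {complex query}

-- later queries join to it by its static name, so no dynamic SQL is needed
select o.*
from Orders o
join tempResults t on t.other_id = o.OrderID
where t.session_id = @SessionID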