JChung2006 wrote: I would really hate to be the guy who has to maintain Raindog's code after he is gone...
lol, so true.
kidzi wrote: OK, a few things... is there a cleanup strategy? (30-day-old things are removed, etc.)
kidzi wrote: How do the tables' record counts get blown out of whack? What do you mean by this?
kidzi wrote: I understand the VersionID column, but what non-version-related operations are you doing, and if you have indices on those, then what is the real issue? An index isn't a bad thing...
kidzi wrote: It seems that if they are performing what-ifs and making as many rows as they are, creating tables for each of those situations will be a nightmare to maintain - even if it is done automatically - because now you'll have to know which table to go into, and looking at the tables from a dbo perspective will be much, much harder to aggregate and manage.
If there is Real data and WhatIf data, then instead of making tables for each version, I'd make a set of six tables for the WhatIf scenarios and put all of the user-generated stuff in there. Then the live data is where it needs to be, and the what-if data can be separated off (and you can do cleanup operations on there at intervals, or however you want to do it).
kidzi wrote: If you put those tables in a different schema, like "Live.Table1" and "WhatIf.Table1", then you get the benefit of being able to use the same sprocs for both scenarios, so you do not have to rewrite all of the 'what if' sprocs, and it stays as close to production as possible.
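Something along these lines - a rough sketch only, with invented table and column names (Table1, Payload, CreatedOn), showing the schema split plus the kind of age-based cleanup mentioned above:

CREATE SCHEMA Live;
GO
CREATE SCHEMA WhatIf;
GO

-- Live table: no versioning baggage, so its indexes serve only production queries.
CREATE TABLE Live.Table1
(
    RecordID INT NOT NULL PRIMARY KEY,
    Payload  NVARCHAR(100) NOT NULL
);
GO

-- Sandbox copy: same shape plus VersionID and a timestamp for cleanup.
CREATE TABLE WhatIf.Table1
(
    RecordID  INT NOT NULL,
    VersionID INT NOT NULL,
    Payload   NVARCHAR(100) NOT NULL,
    CreatedOn DATETIME NOT NULL DEFAULT (GETDATE()),
    CONSTRAINT PK_WhatIf_Table1 PRIMARY KEY (VersionID, RecordID)
);
GO

-- Interval cleanup, e.g. purge what-if data older than 30 days.
DELETE FROM WhatIf.Table1
WHERE CreatedOn < DATEADD(DAY, -30, GETDATE());

With the split in place, the same sproc logic can run against either schema, and cleanup never touches the live tables.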
thumbtacks2 wrote: The idea of the "what if" scenarios makes me think of using queries instead and manipulating data in Excel via pivot tables (if it's possible)... or by using a reporting tool. How often are these user-created db tables reused?
I have a group of about 6 tables that collectively can contain millions of records (it depends on the user). Users have the ability to create a personal version of the data contained in the 6 tables so that they can modify/add/delete whatever they want in the data to perform "what-if" scenarios without affecting the live data. The design originally placed a column called VersionID in all 6 tables, as part of each table's primary key, allowing every record in all 6 tables to be replicated under a modified version identifier. So when a user created a new version, it would copy all the data in each table, apply a new VersionID value to it, and re-insert it into each table (see the sketch below).
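In other words, creating a version is just a keyed self-copy. A minimal sketch of the pattern, with invented table and column names (Table1, Payload):

CREATE PROCEDURE dbo.CreateVersion
    @SourceVersionID INT,
    @NewVersionID    INT
AS
BEGIN
    SET NOCOUNT ON;

    -- Re-insert every row of the source version under the new VersionID.
    -- The same statement is repeated for each of the six tables.
    INSERT INTO dbo.Table1 (RecordID, VersionID, Payload)
    SELECT RecordID, @NewVersionID, Payload
    FROM dbo.Table1
    WHERE VersionID = @SourceVersionID;
END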
This solution seems to be pretty crappy in terms of performance/design.
1) The tables get cluttered with tons of user-created data.
2) The tables' record counts get blown out of whack if excessive versions are made, which in turn kills performance.
3) In order to use these tables with any sort of efficiency, an index needed to be put on the VersionID column to allow records to be selected by it, but adding that index affected the other non-version-related operations on each table.
4) Other issues...
My question is: what would be a better approach? Would it be better to create a copy of each of the tables for each version, so that the data would be separated and data operations would perform better since each version is isolated? Plus, when the version is deleted later, I could simply DROP TABLE each copy, which would be basically instant and wouldn't thrash the indexes of the live tables by deleting millions of records. A rough sketch of what I mean is below.
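Something like this, using dynamic SQL since the table names carry the version (again, all names are invented for illustration):

DECLARE @VersionID INT = 42;   -- hypothetical version being created
DECLARE @sql NVARCHAR(MAX);

-- SELECT ... INTO creates and fills the per-version copy in one statement.
-- Note it does not copy indexes or constraints; those would need re-creating.
SET @sql = N'SELECT * INTO dbo.Table1_v' + CAST(@VersionID AS NVARCHAR(10))
         + N' FROM dbo.Table1;';
EXEC sys.sp_executesql @sql;

-- Deleting the version later is a DROP TABLE, which is cheap compared to
-- deleting millions of rows and never touches the live tables' indexes.
SET @sql = N'DROP TABLE dbo.Table1_v' + CAST(@VersionID AS NVARCHAR(10)) + N';';
EXEC sys.sp_executesql @sql;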
Any input is appreciated.
Has anyone developed a decent-sized application that used the SessionPageStatePersister as opposed to the default HiddenFieldPageStatePersister? If you had a web app that was getting hit pretty severely, I wonder which would produce better performance results.
Thanks in advance for any thoughts.
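For reference, opting in is a per-page override of the PageStatePersister property on System.Web.UI.Page - a minimal sketch, where the page class name is a placeholder:

using System.Web.UI;

public partial class MyPage : Page   // placeholder page class
{
    protected override PageStatePersister PageStatePersister
    {
        get
        {
            // Keeps view state in session state on the server; the response
            // carries only a small hidden field instead of the full blob.
            return new SessionPageStatePersister(this);
        }
    }
}

The trade-off is roughly server memory and session affinity versus response payload size, which is what the performance question comes down to.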