My company has been developing a spatial planning solution since 2009. Initially it was an ArcMap application for creating and maintaining spatial plans (competing with the AutoCAD folks), but recently we added publish-to-web functionality.
Our SDE database design is based on "templating". There is a feature dataset that acts as a template for a new spatial plan. When a new spatial plan is created, we clone that template feature dataset into a new feature dataset, which becomes the new plan. The plan feature dataset contains about 10 feature classes (polygons, lines, points, annotations) and 6 relationship classes.
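At a pseudocode level, the cloning step looks roughly like the sketch below (arcpy-style calls; the connection file path and dataset names are hypothetical, not our actual values):

```python
# Pseudocode-level sketch of the per-plan cloning step.
import arcpy

SDE = r"C:\connections\planning.sde"           # hypothetical SDE connection file
TEMPLATE_FDS = SDE + r"\owner.PlanTemplate"    # the template feature dataset

def create_plan(plan_id):
    """Clone the template feature dataset into a new per-plan dataset."""
    new_fds = SDE + r"\owner.Plan_" + plan_id
    # Copying a feature dataset brings its feature classes and
    # relationship classes along with it.
    arcpy.management.Copy(TEMPLATE_FDS, new_fds)
    return new_fds
```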
Thus we end up with a separate feature dataset for each spatial plan. This design also helps us solve the legal question of who can see a spatial plan and when: we set different privileges on the plan's feature dataset as the plan moves through its status cycle (from being designed to official).
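To illustrate that status cycle, here is a simplified model of the per-status privilege policy (the statuses, role names, and privilege labels below are made-up examples for illustration, not our exact production values; in practice the grants are applied to the plan's feature dataset with the geodatabase privilege tools):

```python
# Simplified illustration of a per-status privilege policy.
# Statuses and role names are hypothetical examples.
PRIVILEGES_BY_STATUS = {
    "draft":    {"planners": "EDIT"},                       # designers only
    "review":   {"planners": "EDIT", "reviewers": "VIEW"},  # open to reviewers
    "official": {"planners": "VIEW", "public": "VIEW"},     # read-only for all
}

def grants_for(status):
    """Return the (role, privilege) grants to apply to a plan's dataset."""
    try:
        policy = PRIVILEGES_BY_STATUS[status]
    except KeyError:
        raise ValueError("unknown plan status: %r" % status)
    return sorted(policy.items())
```

Each time a plan changes status, we look up the policy for the new status and re-apply the grants on that plan's feature dataset.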
Over the last two years our solution has become popular in its target market. And here is the problem: customers with 1000 spatial plans say they want to enter 2000 more within a year. That is no problem for our marketing and sales departments :)
But for development the problem is real: 1000 feature datasets in an SDE geodatabase (SQL Server or Oracle) are painfully slow! It takes about 5 minutes just to open them in ArcCatalog... Upgrading the geodatabase for our next version (topologies, representations)? Leave it running overnight...
People from my company did not get an answer to this question at this year's summit.
Do you have experience with such designs? How did you solve the performance issues?