Coffeehouse Thread

4 posts

SQL Azure ... 50GB limit?

  • Sabot

    Hi All,

     

Yep, it's been a while since I was on here. Other stuff going on in my life has been taking priority, but it's all going to settle down now and I have a bit more time to concentrate on the tech matters of the day ... so did you miss me? Wink

     

Anyhoo, I have a multi-TB database monster that is looking for a home. One of the great things about the Cloud is that you don't have to host stuff yourself; you effectively rent it from someone else, so you don't have to come up with the mega-money to build a farm that will handle this kind of monster. So why oh why is there a 50GB limit on SQL databases? ... this is sooooo small!

     

    So please, please, please Microsoft can we up the limit on SQL Azure so I can give my monsters a new home!

  • vesuvius

I think they must be thinking of the average use case, hence the 50GB cap. Would you be willing to relinquish that amount of data to an unknown entity? What are the benefits, since you already have the hardware and infrastructure to support it (the multi-TB database)?

     

How was Oz? I'm sure that was the last time I recall you lurking around '9.

  • figuerres

    Have you tried to get them on the phone?

    I bet they can do it; it's just that, as was posted, a lot of stuff on Azure is probably way less than 50 megs.

    I know how you may feel; I have a db that will have to have old data moved to a second server soon, and it's 100 gigs and growing...

  • Duncanma

    figuerres said:

    Have you tried to get them on the phone?

    I bet they can do it; it's just that, as was posted, a lot of stuff on Azure is probably way less than 50 megs.

    I know how you may feel; I have a db that will have to have old data moved to a second server soon, and it's 100 gigs and growing...

    They can't do this. The cap is set up because of the way they handle replicating the data for you. Of course, there are other companies and hosting services that could let you put this data up *in the cloud*, but Azure maintains at least three copies of your data and handles a variety of automatic replication and backup services; to do that, they cap the size of the database to ensure they can provide the appropriate level of service to all their customers.

     

    Database sharding/partitioning is one way to take a multi-TB database and push it across many 50GB database instances...

     

    Question though: what is in this database that is taking up so much space? If it is just regular records, then sharding/partitioning is probably the route you want to go, but if you are storing FILESTREAM or blob data you could look at pushing that data into other storage systems (blob storage, for example).
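    The sharding idea above can be sketched in a few lines: pick a stable shard key (say, a customer ID), hash it to one of N smaller databases, and always connect to that shard for that key's rows. A minimal Python illustration follows; the shard count, server name, and database naming convention are all made-up assumptions, not anything SQL Azure prescribes.

    ```python
    import hashlib

    # Hypothetical setup: a ~2 TB dataset split across 40 databases,
    # each kept comfortably under the 50 GB cap.
    NUM_SHARDS = 40

    def shard_for(key: str, num_shards: int = NUM_SHARDS) -> int:
        """Map a shard key (e.g. a customer ID) to a stable shard index.

        MD5 is used here only for an even, stable distribution of keys,
        not for any security purpose.
        """
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        return int(digest, 16) % num_shards

    def connection_string(shard: int) -> str:
        # Hypothetical naming convention: one database per shard index.
        return (
            "Server=tcp:myserver.database.windows.net;"
            f"Database=mydb_shard{shard:02d}"
        )

    # The same key always routes to the same shard, so reads and writes
    # for one customer stay inside one sub-50GB database.
    shard = shard_for("customer-12345")
    print(shard, connection_string(shard))
    ```

    The catch, of course, is that cross-shard queries (joins, aggregates over all customers) now have to be fanned out across every database and merged in the application, which is why picking a shard key that keeps related rows together matters so much.
    
    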

Comments closed