Estimating and budgeting with a "Compute Hour"
-
Hi, with the return of utility computing (aka cloud computing), I am curious about the best way to estimate a budget based on concurrent users, and possibly application complexity. In other words, how can I estimate how many "compute hours" I would need to pay for every month?

Suppose I have 100 business users on a SaaS application running on EC2, Azure, or similar. If we estimate that of these 100 users there are about 50 concurrent users through an 8-hour work day, then 160 (hours per month) * 50 (users) = 8,000 "compute hours" per month, but somehow I don't think that's right. Also, if your application is multithreaded and possibly using more than one core per user per hour, then your estimate goes up.

Maybe I don't understand the concept completely, but with the pricing plans available today it seems like it would be cheaper to go the traditional route of dedicated web and DB servers with a reputable hosting provider. For example, Azure costs $0.12 per compute hour. So if we take my estimate, 8,000 hrs * $0.12 = $960 per month = $9.60 per user per month (and that's just compute time, not to mention bandwidth, storage, DB, etc.). Does this also seem expensive to anyone? Am I missing something completely? Thanks!
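To make the arithmetic above easier to follow, here is a small Python sketch of the same back-of-the-envelope estimate. The figures (50 concurrent users, 160 hours/month, $0.12/hour) come straight from the question; the `cores_per_user` knob is a hypothetical parameter standing in for the multithreading point, not anything from a provider's pricing formula.

```python
# Back-of-the-envelope estimate of monthly "compute hours" and cost,
# using the assumptions stated in the question (not an official pricing model).

def estimate_monthly_cost(concurrent_users, hours_per_month, rate_per_hour,
                          cores_per_user=1):
    """Estimate compute hours and cost if every concurrent user occupies
    `cores_per_user` cores for the whole working month."""
    compute_hours = concurrent_users * hours_per_month * cores_per_user
    return compute_hours, compute_hours * rate_per_hour

# Figures from the question: 50 concurrent users, 8 h/day over a 20-day month
# = 160 h/month, and $0.12 per compute hour (the quoted Azure rate).
hours, cost = estimate_monthly_cost(concurrent_users=50,
                                    hours_per_month=160,
                                    rate_per_hour=0.12)
print(f"{hours:.0f} compute hours -> ${cost:.2f}/month "
      f"-> ${cost / 100:.2f} per user (100 total users)")
# 8000 compute hours -> $960.00/month -> $9.60 per user (100 total users)
```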
-
I guess that "computing time" is the time you're actually using the CPU, as on a normal computer. Open up Task Manager and look at the load on your CPU; you'll find it's often idle, doing nothing. Now go to the "Processes" tab, open the "View" menu and click "Select Columns" - you'll find a column called "CPU Time". That's the time the CPU spends on raw processing for that application. I don't think you need to pay for the idle time during those 8 work hours, which would effectively reduce your cost to a fraction of the original calculation.
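To illustrate that reasoning only (whether a given provider actually bills this way is exactly the open question here), a quick sketch that scales the earlier estimate by a hypothetical average CPU-utilization fraction:

```python
# Sketch of the reply's argument: if you were billed only for time the CPU is
# actually busy, the bill would shrink in proportion to average utilization.
# `utilization` is an illustrative made-up value, not a measured figure.

wall_clock_hours = 8000      # the estimate from the question
rate_per_hour = 0.12         # quoted Azure rate from the question
utilization = 0.10           # hypothetical: CPU busy ~10% of the working day

billed_hours = wall_clock_hours * utilization
print(f"{billed_hours:.0f} busy hours -> "
      f"${billed_hours * rate_per_hour:.2f}/month")
# 800 busy hours -> $96.00/month
```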