You spend 50-100x more on your smartphone than Google or FB make from you in ad revenue. They pay for their clouds out of that ad revenue
68 replies and sub-replies as of Oct 10 2016

how do you see this balance for wearables and the smart home? Cheaper and smaller devices, more data/sensor-oriented
look at the GPU layer of the chip stack. There's a reason GPUs are used in the cloud rather than a normal CPU.
and you mistake ASP for cost. Margins for FB and Google are MUCH higher than AAPL's.
do you think I don’t know that? 😁
seems like you don't know a lot of things.
while you know a lot. Right? Discuss idea not the person. @BenedictEvans
wow. This is a dumb comment. You don't take into account parallel processing and utilization.
and that's why it's good to sell hardware (if you can)
So why don't Google/FB sell a home cloud personal server?
because ‘where is the cheapest computation?’ is not the most relevant question for that product. For others it is.
where do you get 50-100x? I get more like 2-5x
does Facebook make $200-300/user?
Google makes $50B/year from say 1B users. So $50/user/year.
$85bn run rate, but 3bn users. ARPUs of $13 (FB) and $28 (Google). 25-50x, say.
what are you using for the cost of the device?
now? $250-600. Android ASP ~$225
the device lasts more than 1 year. I also think the average is below $200 if you look globally at all 3B users.
One can swing the precise number around - the point is that the idea that there is ‘infinite compute’ in the cloud vs the device is wrong
more like "there is (almost) infinite compute in 5Bn devices"-and many more advantages to distribution besides @adamdangelo
yeah, I disagree with the overall conclusion, not just the details
‘I bought this phone last year for $200. You have a global capex of x% of $13/user/year’.
1. most of the phone cost isn't going to computation; 2. the phone is idle most of the time, but cloud servers are working
3. compute per dollar is way more efficient in server settings
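The utilization point above can be made concrete with a back-of-envelope sketch. Every number here is an illustrative assumption for the sake of the argument, not a figure from the thread or a measurement:

```python
# Back-of-envelope: effective hardware cost per *used* compute-hour.
# All constants are illustrative assumptions, not measured figures.

HOURS_PER_YEAR = 24 * 365

def cost_per_used_hour(hardware_cost, lifetime_years, utilization):
    """Amortized hardware cost per hour of compute actually performed."""
    total_used_hours = HOURS_PER_YEAR * lifetime_years * utilization
    return hardware_cost / total_used_hours

# Assumed: $200 phone, 2-year life, actively computing ~5% of the time.
phone = cost_per_used_hour(200, 2, 0.05)
# Assumed: $3000 server, 4-year life, ~60% utilization.
server = cost_per_used_hour(3000, 4, 0.60)

print(f"phone : ${phone:.4f} per used hour")
print(f"server: ${server:.4f} per used hour")
```

Under these assumptions the server comes out cheaper per used hour despite costing 15x more up front, which is the idle-phone argument in miniature; flip the utilization assumptions and the edge wins again.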
A truck can carry more than a car, but your customers drive themselves to wal-mart for free
the vast majority of shipping happens between factories and the local walmart, not in cars driving things home from the store
I think if you go out to 3B users the average is more like $100/year
so that gets you to 4-8x
$200 ASP / $13 ARPU is ~15x. The ASP of FB users is rather higher - $300+
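The multiples being argued back and forth can be reproduced from the thread's own figures; device lifetime is the main free variable:

```python
# Spend multiple: what a user pays for the device per year vs what the
# platform earns from that user per year. Figures are those quoted in
# the thread ($200-600 ASPs, $13 FB / $28 Google ARPUs).

def multiple(device_asp, arpu_per_year, device_life_years=1):
    """Annualized device spend divided by ad revenue per user per year."""
    return (device_asp / device_life_years) / arpu_per_year

print(multiple(200, 13))      # ~15x: $200 phone vs FB's $13 ARPU
print(multiple(200, 13, 2))   # ~7.7x if the phone lasts two years
print(multiple(600, 28))      # ~21x: high-end phone vs Google's $28 ARPU
```

Stretching the lifetime or dropping the ASP moves the multiple toward the 2-5x end claimed upthread; the original 50-100x needs both a high ASP and a one-year life.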
That is, Google and FB’s users spend far more on computing power than the companies themselves can. Edge > cloud, in aggregate
Yes! So much this. Big reason I dug into @monaxHQ's stuff was this potential of what they call "Participatory Architecture"
It makes a lot of sense to try to move as much computing as possible to the devices that your users already paid for.
Depends on the application. Games & storage, sure. But something like FB & Google Search can't be run locally. At all
ah good point
Computing power in a smartphone is often a LOT more than the amount in the cloud that can economically be allocated to that user.
maybe we should use P2P more on smartphones
if you think of the combined computing power of all our phones as one computer - it's a big one.
Computing on the device uses power, but then so does turning on the radios and transmitting it to the cloud.
But RF uses more, because RF is unpredictable. You have 4 bars, but that means you hear the tower, not vice versa, so the radio boosts its power
And the great thing about ML is that training the model & running it are two different things. Trend: train in the cloud, run on device.
This will work for some algorithms, like decision trees, but will fail for most clustering algorithms.
what about train and run on the device? Allows business models with lower margins
The parallel was the hype/fashion in 1999 for thin client. Then runtime in browser of course grew. Moore's Law on both sides.
Will we see competitors stealing machine learning parameters / models from each other's apps?
calculating the fitness of a sample is usually (ideally) a linear function - a dot product of vectors.
even running the model can be computationally quite expensive depending on the model application
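The split described above - train in the cloud, run on the device - can be sketched with a toy logistic regression standing in for a real model. The function names here are illustrative, not any real API:

```python
import json
import math

def train_in_cloud(samples, labels, lr=0.5, epochs=200):
    """Heavy part: fit weights by gradient descent on server hardware."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))       # sigmoid prediction
            g = p - y                         # gradient of log loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return json.dumps({"w": w, "b": b})       # ship this blob to devices

def run_on_device(model_blob, x):
    """Cheap part: one dot product and a sigmoid per prediction."""
    m = json.loads(model_blob)
    z = sum(wi * xi for wi, xi in zip(m["w"], x)) + m["b"]
    return 1 / (1 + math.exp(-z))

# Toy data: 1-D inputs, label flips around 1.5.
blob = train_in_cloud([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1])
print(run_on_device(blob, [3.0]))  # close to 1: device-side inference
```

The asymmetry is the point: training loops over the data many times, while inference is a single dot product - exactly the kind of work a phone can afford.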
Sure. But turning on the radios might use a lot more battery
not quite - reports on the breakdown of smartphone power show the screen and CPU use way more than WiFi
processing image recognition locally uses less power than transmitting the image
could be, if you mean image filters; but for object recognition you still need the cloud to train the algorithm
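The radio-vs-compute energy claim can be framed as a back-of-envelope. Both constants below are assumed order-of-magnitude placeholders, not measurements:

```python
# Hedged comparison: energy to upload a photo over cellular vs energy
# to classify it locally. Both constants are assumed round numbers for
# illustration only.

JOULES_PER_MB_CELLULAR = 5.0   # assumed radio cost per megabyte sent
JOULES_PER_LOCAL_INFER = 1.0   # assumed cost of one on-device inference

def upload_energy(photo_mb):
    """Energy to transmit a photo of the given size, under the assumption above."""
    return photo_mb * JOULES_PER_MB_CELLULAR

photo_mb = 3.0
print(f"upload: {upload_energy(photo_mb):.1f} J")
print(f"local : {JOULES_PER_LOCAL_INFER:.1f} J")
```

Under these assumptions a 3 MB photo costs 15x more energy to ship than to classify locally; real numbers vary hugely with signal strength and model size, which is exactly the dispute in the replies above.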
if you add up all iOS devices, do you think Apple could control more aggregate computing power than Google/FB?
would you tell me what you mean by "computing"?
But compute in the cloud can accommodate oversubscription by factor of 50. #Erlangs
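The oversubscription point (which the #Erlangs tag alludes to) can be illustrated with the Erlang B blocking formula; the user count and activity rate below are assumptions:

```python
# Erlang B: probability that a request finds all servers busy, given an
# offered load (in Erlangs) and a number of servers. Standard recursive
# form; the scenario numbers are illustrative assumptions.

def erlang_b(offered_load, servers):
    """Blocking probability for `offered_load` Erlangs on `servers` lines."""
    b = 1.0  # blocking with zero servers
    for m in range(1, servers + 1):
        b = (offered_load * b) / (m + offered_load * b)
    return b

# Assumed: 1000 users, each active 2% of the time -> 20 Erlangs offered.
load = 1000 * 0.02
for servers in (20, 25, 30):
    print(servers, round(erlang_b(load, servers), 4))
```

Under these assumptions roughly 30 servers carry 1000 users with about 1% blocking - a ~33x oversubscription - which is why per-user cloud capacity can be far below per-user device capacity.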
Logical, but why does it seem that apps are moving away from edge computing and into the cloud? Even image processing, etc.
Long term though the opposite makes sense. Decrease upfront cost to bring the next half of the world online.
Desktop maybe; on mobile there's too much potential battery drain.
transmission uses power. LOTS of power
Fair point. Depends on the specifics - straw-man extremes: sending data vs mining a Bitcoin. 😀
I think the real conclusion is that for data-heavy work, you want to move the computation to the data
Yes, but anything run on a CLIENT can be manipulated there to cause you grief, so be careful what you calculate there
If that were the only cost involved.
transmission uses power.
it makes even more sense to tell people it's for privacy reasons.
spend on buying my phone or spend on buying things on my phone?
So whatever they can run at the device level, they will, forcing you to pay more for the device.