Are we sure it’s fresh? Are we excluding projects like data centers with seaside cooling loops?
Do you think if you exclude those it isn’t still a major fresh water wastage issue during a water crisis?
Here’s where just Google’s data centers are. How many of those do you think are cooled with sea water?
Edit: Yes really. They have a DATA CENTER in LAS VEGAS.
They have a few, and some heavily armed security in some of them as well. I had a tour of one when a company I worked for was looking for a secondary DC. Bulletproof glass, assault rifles, body scanners, and a few other nutty things. Those gambling establishments have to store their data somewhere.
Isn’t the whole point of a data center that you don’t need to be near one to store your data?
Depends on the data. For your family photos and the home videos from the ’90s you digitized, it doesn’t matter where they’re stored. If it’s financial data or a stock market exchange, then speed, and therefore nearness, plays a big factor.
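To put rough numbers on “nearness”: here’s a quick back-of-the-envelope sketch (Python) of best-case round-trip times over fiber. The ~200,000 km/s figure for light in fiber and the example distances are my own assumptions for illustration, not anything from the comment above.

```python
# Rough sketch of why physical distance matters for latency-sensitive data.
# Assumption: light in optical fiber travels at roughly 200,000 km/s (~2/3 c)
# and the route is a straight line, so real paths will be noticeably worse.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time over fiber for a given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# Hypothetical distances, for illustration only.
for label, km in [("same metro area", 50),
                  ("US coast to coast", 4000),
                  ("New York to Mumbai", 12500)]:
    print(f"{label:>20}: {round_trip_ms(km):6.1f} ms round trip (best case)")
```

That’s the floor set by physics before any server does any work, which is why trading and exchange systems park themselves as close as possible.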
Then I guess the only option is to shut down Las Vegas and never go back. I’d be fine with that.
Web performance is interesting, if you like that nerdy stuff. More than half of users will abandon a website if the initial load takes longer than about 3 seconds, so most companies aim for an initial load of ~1 s. After the initial load, people perceive parts of the site as ‘functional’ if they respond within 200 ms or less. A coast-to-coast round trip within the US already eats ~50 ms of that budget, so actually doing something with the user’s data has to be very performant. Outside the US, and especially in India, the latency is pretty terrible. Because of how connectivity is stitched together across different vendors, you’ll sometimes see one half of a city with very fast times and the other half with terrible ones. Same for mobile networks. It’s absolutely the unregulated wild west out there, and users don’t see it. There are also bad actors, accidental misconfigurations, and bad planning all over the place.
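If you want to see where the time goes yourself, here’s a minimal sketch (Python, standard library only; example.com is just a placeholder) that separates DNS, TCP connect, and time to first byte. It’s crude compared to browser dev tools, but it makes the budget above concrete.

```python
# Hand-rolled latency probe: how long DNS, the TCP handshake, and the first
# byte of the response each take. Minimal sketch using only the standard
# library; real measurements would use proper tooling (dev tools, WebPageTest).

import socket
import time
import urllib.request

def probe(host: str, url: str) -> None:
    t0 = time.perf_counter()
    socket.getaddrinfo(host, 443)            # DNS resolution
    t1 = time.perf_counter()

    with socket.create_connection((host, 443), timeout=5):
        pass                                 # TCP handshake only
    t2 = time.perf_counter()

    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read(1)                         # wait for the first byte of the body
    t3 = time.perf_counter()

    print(f"DNS:         {(t1 - t0) * 1000:7.1f} ms")
    print(f"TCP connect: {(t2 - t1) * 1000:7.1f} ms")
    print(f"First byte:  {(t3 - t2) * 1000:7.1f} ms (new connection, incl. TLS)")

probe("example.com", "https://example.com/")
```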
So, having both data centers and tons of mini data centers called POPs (points of presence) near large cities is how the internet stays fast. These POPs handle the initial connection and establish encryption between the user and the application in the data center, sometimes figure out the best way to steer traffic, and sometimes serve cacheable content directly.
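To see why handling the initial connection at a POP matters, here’s a tiny arithmetic sketch. The 10 ms and 80 ms RTTs are made-up illustrative numbers; the handshake accounting (one round trip for TCP, one more for TLS 1.3, two for TLS 1.2) is the usual textbook count.

```python
# Back-of-the-envelope: why terminating TLS at a nearby POP helps.
# RTT values below are hypothetical examples.

def setup_ms(rtt_ms: float, tls_rtts: int = 1) -> float:
    """Time until the first HTTP request can go out: TCP + TLS handshakes."""
    return rtt_ms * (1 + tls_rtts)

user_to_pop_rtt = 10     # hypothetical: user to a POP in their metro area
user_to_origin_rtt = 80  # hypothetical: user straight to a distant data center

print("Handshakes to distant origin:", setup_ms(user_to_origin_rtt), "ms")
print("Handshakes to nearby POP:    ", setup_ms(user_to_pop_rtt), "ms")
# The POP keeps long-lived, already-encrypted connections open to the origin,
# so the expensive setup happens over the short user-to-POP hop instead.
```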
Another thing that keeps sites fast is content delivery networks, or CDNs. CDN companies sell their services to other companies on the internet. Their value is in maintaining a large number of POPs and data centers to store your highly cacheable content that changes pretty seldom, like your profile picture on LinkedIn or Facebook. All CDNs really do is serve up cached files quickly, using many data centers and a large number of POPs. So when you load a web page, it’s fetching stuff not just from the company that owns the site, but from CDNs, and often from third parties as well: trackers, servers that provide fonts and images. It’s a huge mess.
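Conceptually, a CDN edge node is little more than a cache with an expiry. Here’s a toy sketch (Python; the URL and the origin fetch are placeholders, and real CDNs honor Cache-Control headers, handle invalidation, and much more) just to make the “serve up cached files quickly” part concrete.

```python
# Minimal sketch of what a CDN edge cache does conceptually: serve a copy from
# memory if it's still fresh, otherwise fetch from the origin and remember it.

import time

class EdgeCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store: dict[str, tuple[float, bytes]] = {}  # url -> (stored_at, body)

    def fetch_from_origin(self, url: str) -> bytes:
        # Stand-in for the slow trip back to the origin data center.
        return f"contents of {url}".encode()

    def get(self, url: str) -> bytes:
        entry = self.store.get(url)
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]                      # cache hit: fast, served locally
        body = self.fetch_from_origin(url)       # cache miss: slow origin fetch
        self.store[url] = (time.time(), body)
        return body

edge = EdgeCache(ttl_seconds=3600)
edge.get("https://example.com/profile-pic.jpg")  # miss -> fetched from origin
edge.get("https://example.com/profile-pic.jpg")  # hit  -> served from the edge
```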
Video and voice are the new frontier of this stuff, and they add more complication on top of the previous complication. They demand low latency all the time, so a lot of it is deployed in POPs or over peer-to-peer connections when possible.
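For a sense of how tight the budget is: a commonly cited target for conversational voice is roughly 150 ms of one-way delay (from ITU-T guidance), and the network is only one piece of it. The line items in this sketch are my own illustrative guesses, not measurements.

```python
# Rough one-way delay budget for real-time voice, to show why "low latency all
# the time" is hard. Individual numbers are illustrative assumptions.

budget_ms = {
    "capture + encode":    20,
    "packetization":        5,
    "network (one way)":   60,   # the part POPs / peer-to-peer paths try to shrink
    "jitter buffer":       40,
    "decode + playout":    15,
}

total = sum(budget_ms.values())
print(f"total one-way delay: {total} ms (commonly cited target: ~150 ms)")
for stage, ms in budget_ms.items():
    print(f"  {stage:<20} {ms:3d} ms")
```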