Announcing the launch of Data Center Discovery!

The complexity and growth rate of the data center marketplace can make sourcing the best products and services a challenge. Identifying, locating and qualifying suppliers of data center products and services is often a time-consuming and confusing process.

Other industries utilize helpful online tools that allow purchasing agents to make efficient and educated buying decisions. Web-based services such as GlobalSpec and ThomasNet allow purchasers in the manufacturing sector to perform comprehensive searches for the products and services that they need from detailed and extensive directories of suppliers.

Until now, the data center industry has lacked a central directory that collects corporate profiles and contact information for the companies that make up the data center marketplace. Data Center Discovery fills that need.

Data Center Discovery is a worldwide, online directory of firms that are involved in the data center construction and operations industry. For the first time, all the players in the data center marketplace have a professional and well-organized venue to promote their capabilities and value to a receptive audience.

This collection of valuable information will allow purchasers of data center products and services to identify suitable business partners and make informed purchasing choices.

Visitors to the website can browse the directory free of charge. Businesses can also post basic profiles free of charge. However, more feature-rich profiles are available to corporations for a recurring monthly fee. The amount of the monthly fee is based on the number of categories, geographies and specialties under which the service provider appears.

For more information please visit the site at www.datacenterdiscovery.com

DARPA is working on a new chip design. Promises high efficiency, powerful compute and…low accuracy?

Photo credit DARPA

The Defense Advanced Research Projects Agency (DARPA) is the US Department of Defense (DoD) agency responsible for the development of new technologies for use by the military. 

The scientists at DARPA routinely launch projects so audacious and ambitious that their project list looks like something conjured up by science fiction writers and Hollywood directors rather than legitimate scientists.  Thought-controlled bionic arms? Check.  (See project Proto2.)  Iron Man-style powered exoskeleton?  You bet.  (See project XOS.)  Battlefield telepathy? Yep.  (See project Silent Talk.)  Throw in a half dozen autonomous robots and an assortment of super-soldier tactical gear and you start to get a picture of a group that lives on the bleeding edge of science and engineering.

Given the radical technology innovations these guys are dreaming up, it’s no surprise that they’re getting frustrated with the slow pace of computer chip efficiency improvements.  When your projects include surveillance systems that can “track everything that moves” in an entire city (CTS), you obviously need computers with serious processing power.  But, just as importantly, these DARPA projects also require computers that make efficient use of electricity.

Per-chip processing power has continued to double every 18 months (roughly in accordance with Moore’s Law).  However, chip energy efficiency has reached a near dead end.  In other words, power scaling has all but ceased.  As a result, battery-powered devices can’t keep up with the energy demands of the computer chips.

To address the chip efficiency issue, DARPA is throwing away the digital rule book and designing a new generation of ANALOG computer chips.  Instead of using the energy-intensive, Boolean-logic strategy of driving voltage into transistors to flip their state between zero and one, DARPA is exploring low-power “probabilistic” computing made possible by analog circuits.

Daniel Hammerstrom, DARPA program manager for project UPSIDE, expects intelligence, surveillance and reconnaissance (ISR) systems utilizing the new technology to be faster and “orders of magnitude more power-efficient.”

How does it work?  According to the DARPA press release: “UPSIDE envisions arrays of physics-based devices (nanoscale oscillators may be one example) performing the processing. These arrays would self-organize and adapt to inputs, meaning that they will not need to be programmed as digital processors are. Unlike traditional digital processors that operate by executing specific instructions to compute, it is envisioned that the UPSIDE arrays will rely on a higher level computational element based on probabilistic inference embedded within a digital system.”  (A super-fast, super-efficient, self-organizing, self-programming nano-computer?  What could possibly go wrong?)

Probability computing abandons the it’s-either-one-or-zero straightforwardness of digital computing.  As a result, it sacrifices some accuracy.  For imaging and surveillance applications, the inexact nature of analog, probability-based processing may be sufficient.
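As a rough, back-of-the-envelope illustration of that trade-off (a toy sketch, not DARPA’s actual approach), a sampled estimate does a tiny fraction of the work of an exact computation and pays for it with a small error:

```python
import random

random.seed(42)

# A toy stand-in for a sensor image: one million pixel intensities.
pixels = [random.random() for _ in range(1_000_000)]

# Exact (digital-style) answer: touch every pixel.
exact_mean = sum(pixels) / len(pixels)

# Probabilistic-style answer: sample a small fraction of the pixels.
sample = random.sample(pixels, 1_000)
approx_mean = sum(sample) / len(sample)

# ~1000x less work, at the cost of a small sampling error.
error = abs(approx_mean - exact_mean)
print(f"exact={exact_mean:.4f} approx={approx_mean:.4f} error={error:.4f}")
```

For an ISR task like “is something moving in this frame,” an answer that is almost certainly right and cheap to compute can beat an exact answer that drains the battery.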

Ben Vigoda, the general manager of the Analog Devices Lyric Labs group, seems to think that the technology may be applicable to the problem of energy consumption by data centers and server farms.  In an article for Wired, Vigoda stated, “We’re using a few percent of the U.S.’s electricity bill on server farms and we can only do very basic machine-learning. We’re just doing really, really simple stuff because we don’t have the compute power to do it. One of the ways to fix this is to design chips that do machine-learning.”

Maybe.  But a large portion of those server farms using all that electricity are financial sector facilities.  I don’t see them (for example) rushing out to cut their power bills by introducing errors into their data. 

For the foreseeable future, information systems still require three core characteristics: Confidentiality, Integrity and Availability (CIA).  I don’t see many takers for computing strategies that sacrifice one of these core principles for better energy efficiency.

Progressive Insurance looking for data center engineer

Progressive is looking for an experienced engineer to help manage a 24X7 Data Center environment, while repairing and maintaining electrical, mechanical, and plumbing systems or equipment.

If you’re interested in being considered, please go to http://progressive.jobsync.com to learn more about the job and complete your confidential, five-minute candidate profile.

For more information please contact:
Wilhelm Ong
JobSync
wilhelm.ong@jobsync.com

Largest magnetic storm of the season to strike this weekend

On Thursday, July 12, 2012, the sun let loose with a huge solar flare and coronal mass ejection. The X1.4 class solar flare emerged from a sunspot in active region 1520 at 12:11 PM EDT.

The flare unleashed a staggering amount of energy roughly equivalent to a billion hydrogen bombs. That energy is now headed toward the Earth at a blistering 850 miles per second.
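At that speed, the Sun-to-Earth transit time works out to a little over a day, consistent with a Thursday eruption arriving over the weekend. A quick sketch of the arithmetic, assuming a constant speed over the mean Sun-Earth distance (real CMEs decelerate en route, stretching the trip to two or three days):

```python
SUN_EARTH_MILES = 93_000_000  # mean Sun-Earth distance
CME_SPEED_MPS = 850           # miles per second, as reported

# Transit time at constant speed.
transit_seconds = SUN_EARTH_MILES / CME_SPEED_MPS
transit_hours = transit_seconds / 3600

print(f"~{transit_hours:.0f} hours (~{transit_hours / 24:.1f} days)")
```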

Solar scientists are expecting a moderately severe magnetic storm when the CME strikes the Earth’s magnetic field this weekend.  Alex Young of NASA’s Goddard Space Flight Center stated, “This could produce auroras as far south as northern California and Alabama [and] into central UK and Europe or southern New Zealand.”

As previously discussed on this blog, geomagnetic storms can disrupt electrical and communication systems that are vital to the operation of data centers and other mission critical facilities.

Solar scientists warn that this magnetic storm could lead to intermittent satellite/radio navigation problems, surface charging on satellites and power grid fluctuations.

There’s a Little Black Spot on the Sun Today

Astounding Photo of Massive Sunspot Cluster (Active Region 1476)

Astrophotographer Thierry Legault captured this amazing shot of Sunspot Cluster (Active Region 1476).

This sunspot cluster is approximately 120,000 miles across.  The big spot on the spinward side is roughly 60,000 miles across.  When you consider that the Earth is only 8,000 miles wide you start to understand how truly massive these structures are. 

Despite its staggering size and high magnetic activity levels, this sunspot cluster has been relatively quiet.  The spot has produced a number of small flares but nothing close to the X-class flares or CMEs that can disrupt terrestrial communications and utility power.  The strongest flare from this spot so far was an M1.4-class flare on May 5, 2012.

Hopefully, 1476 will remain mild-mannered despite its size. 

Also shown in this magnificent photo is the new Chinese space station.  If you look carefully directly west of the sunspot cluster you can pick out a small winged shape.  That’s Tiangong-1, the first module of China’s planned space station transiting the sun.


Net Environmental Benefit (NEB): A Data Center Metric to Satisfy Greenpeace

Environmentalists crack me up.  I dated one once when I was in the Navy.  I’ll call her Mary for the purposes of this story.   One evening during our brief relationship, I was telling Mary some (unclassified) stories about submarine life.  I casually mentioned that we spend a lot of time shooting sea slugs for practice.  Mary was shocked and appalled. 

For those of you unfamiliar with subs, when you test torpedo firing systems on a fast attack you fill the torpedo tubes with seawater and fire the “slug” of water as if it were an actual torpedo.  It’s a good simulation and completely harmless.  However, in Mary’s imagination, we were prowling the ocean floor, hunting unwary sea slugs and blowing them into watery oblivion. I probably should have corrected Mary’s thinking but it was too much fun watching her agonize over the harmless sea creatures we were cruelly using for target practice.

That should be the end of the story…but it’s not.  Mary and I broke up after a few weeks.  She didn’t seem to take it too badly at the time.  But, the next time our crew returned to port, we found flyers under the windshield wipers on every car in long term parking.  The flyers pleaded in bold letters, “Save the Sea Slugs!” and went on to describe the Navy’s cruel vendetta against harmless marine animals.  The flyer demanded that naval officials cease the cavalier and unnecessary destruction of sea life.  Oh my, how we laughed!

I hoped that I was done with this type of nonsense when I left the Navy and entered the data center industry.  No such luck.  Environmentalist juggernaut Greenpeace has been after Facebook for building data centers in areas where the percentage of electricity generated by coal is too high for their tastes.  Greenpeace rallied over 180,000 followers to their “Unfriend Coal” campaign.  Never mind that Facebook’s data centers are among the most energy efficient and environmentally sustainable buildings ever built.  Never mind that Facebook has shared every efficiency strategy that they employed through the Open Compute project. As a result of Open Compute, the entire data center industry has been able to achieve a more efficient posture.

Facebook has not been the only data center to draw fire from Greenpeace.  Greenpeace has also targeted Apple’s data centers.  Never mind that the Apple data centers are marvels of energy efficiency and sustainable design.  Never mind that Apple is building the largest end user owned solar array in the country at their Maiden, NC data center.  Never mind that Apple is also building the largest biogas/fuel cell installation (outside of utility) in the US at the same data center.  The commitment to the development of green/alternative energy technology demonstrated by Apple is unparalleled.   

In Greenpeace’s misguided and myopic view, data centers consume large amounts of electricity and are therefore bad.  The reality is that these facilities are on the bleeding edge of energy conservation and sustainable design.  These data centers are monuments to the fact that the companies that built them, and the data center industry as a whole, care deeply about conservation and are actively advancing building efficiency to staggering new levels.

Greenpeace is also missing the big picture at an even more profound level.  Environmentalists should actually applaud the construction of data centers.  Here’s why: data centers are built and applied to existing business models because data center technology provides a business delivery efficiency improvement over the previous business paradigm.  These improvements in business delivery efficiency (usually) result in a net environmental benefit.

For example: Facebook builds a bunch of data centers and suddenly 800M people share 60B photos online.  As a result, the kiosks, drug stores and grocery stores that used to process film and print photos on paper see that business almost completely disappear.  The home photo printer industry also flattens and begins to decline.  That’s a massive business delivery paradigm shift.  Digital photo sharing on this scale is only possible by applying data center technology.  Now, imagine the net environmental impact of all of those people NOT driving their film or memory card to the store, NOT consuming photo developing chemicals, NOT mailing pictures to relatives, NOT purchasing replacement ink for their printers.  How many miles were not driven?  How many toxic chemicals were not used?

Amazon is another great example.  Brick and mortar book stores and print media in general are in decline.  Amazon built a more efficient business model to deliver that content to readers.  That business model was enabled by the application of data center technology.  Again, how many miles to the book store were NOT travelled?  How many buildings NOT constructed?  How many trees were NOT felled for their paper?

Both of these examples only scratch the surface of the net environmental benefit enabled by data center technology.

What the data center industry (and the companies that use the technology) need is a new metric.  I’ll call it Net Environmental Benefit (NEB).  NEB will encompass all of the benefits that are enabled by data centers and boil them down to a handy three-digit integer in units of megatons of carbon.  It will be a bear to calculate but next time the Greenpeace nitwits start protesting, Facebook can roll out its astronomical NEB and squash them with it.

12% Chance of Grid Crippling Solar Storm in Next 10 Years

Today another stunning Coronal Mass Ejection (CME) erupted from the Sun’s corona.  Fortunately, the massive flare was not directed at the Earth. 

A CME like this sends a cannonball of superheated plasma, energetic particles and radiation barreling through space.  If the Earth is unlucky enough to be in the path of one of these monsters the consequences can be dire.  Notable effects can include spectacular auroras, widespread communication failures and massive blackouts.

In 1859 a large solar storm struck the Earth in an event known as the Carrington Event.  The Carrington flare caused telegraph stations to burst into flame due to induced currents in the copper transmission lines.  Auroras caused by the event were seen as far south as Havana.

In 1921, another huge solar storm and geomagnetic event struck the Earth.  John Kappenmann, from the Metatech Corporation (and co-author of the NASA/NRC report on the economic impact of space weather), modeled the 1921 storm on the modern power grid.  The map of power system collapse and utility transformer damage is a stark warning of the threat of space weather to power distribution systems. 

 

More recently, in 1989 a large flare caused the collapse of the Hydro-Quebec power grid.

In 2008 solar scientists predicted that a Carrington scale solar event today could cause blackouts affecting 130 million people and result in economic losses of “$1 trillion to $2 trillion during the first year alone…with recovery times of 4 to 10 years.”

So, the consequences of the Earth being struck by a massive solar storm are potentially severe.  But such an event can surely be classified as a High Impact/Low Probability (HILP) event, right?  Not so fast, my friend.  Pete Riley, a predictive science expert, published a paper in the scientific journal Space Weather: The International Journal of Research and Applications in February of 2012.  In his paper, Riley places the probability of a Carrington-class event occurring over the next decade at roughly 12%.

Around 12%!  Yikes! Those Doomsday Preppers on NatGeo may be on to something.

I outlined the near earth and terrestrial effects of a massive solar flare/geomagnetic storm in my blog here.

I covered the potential effects of space weather on the infrastructure that supports data centers for Mission Critical Magazine here.
