The traditional world of databases, which house everything from arcane customer records and inventory to your Facebook profile and Google listings, is being shaken up by the growing popularity of a new kind of data model called NoSQL.
This disruptive technology scorns a world of structured data relationships and forces a change in the way computer programmers and database administrators view information.
It may well be the perfect companion to cloud computing, which is transforming on-premise IT departments.
New database projects like CouchDB use replication over a peer-to-peer model (think Napster) to copy information almost instantly between different websites, applications and devices.
For instance, the entire customer Rolodex in a cloud CRM system like Salesforce.com could be instantaneously replicated to a local mobile device or laptop.
For decades, web developers and programmers have worked with traditional databases such as MySQL and SQL Server, using a language called SQL to add, modify and delete records.
CouchDB does something radically different. It can cut the lines of code needed to perform these operations by up to 90%, and it uses a replication model to handle loosely structured data such as documents or files and make sense of the information universe.
This document-oriented database is written in Erlang, a veteran language that has been used for years to program telephone switches.
Instead of SQL queries, the cloud programmer of the future must become familiar with something called 'views' to make sense of the data.
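In CouchDB itself, a view is a JavaScript map function stored in a design document; the database runs it over every document and returns the emitted rows sorted by key. As a rough illustration of the concept only (the documents and function names below are invented for this sketch), here is the same idea simulated in plain Python:

```python
# Toy simulation of a CouchDB-style map view. In real CouchDB the map
# function is JavaScript inside a design document; this only shows the idea.

docs = [
    {"_id": "c1", "type": "customer", "name": "Ada", "city": "London"},
    {"_id": "c2", "type": "customer", "name": "Grace", "city": "Leeds"},
    {"_id": "o1", "type": "order", "customer": "c1", "total": 42.0},
]

def map_customers_by_city(doc, emit):
    # Roughly: function(doc) { if (doc.type == "customer") emit(doc.city, doc.name); }
    if doc.get("type") == "customer":
        emit(doc["city"], doc["name"])

def run_view(documents, map_fn):
    rows = []
    def emit(key, value):
        rows.append((key, value))
    for doc in documents:
        map_fn(doc, emit)
    # CouchDB returns view rows sorted by key
    return sorted(rows, key=lambda row: row[0])

print(run_view(docs, map_customers_by_city))
# [('Leeds', 'Grace'), ('London', 'Ada')]
```

Instead of asking the database a question in SQL, you declare up front which keys and values matter, and query the precomputed result.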
If a network of sites shares information using the CouchDB model, the network would continue to operate even if one of the sites went down.
One implication is that if you are no longer satisfied with your web host, for example Amazon Web Services, you can replicate your data to a local device or to another hosting company almost instantly.
No downtime. No hassle.
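The mechanics behind that claim can be sketched in a few lines. This is a deliberately simplified toy, assuming each document carries a single revision counter; real CouchDB tracks full revision trees and resolves conflicts deterministically, but the pull-what's-newer shape is the same:

```python
# Toy sketch of peer-to-peer replication between two nodes.
# Assumption: one revision counter per document (CouchDB actually
# keeps a revision tree and a deterministic conflict-resolution rule).

class Node:
    def __init__(self, name):
        self.name = name
        self.docs = {}  # doc_id -> (revision, body)

    def put(self, doc_id, body):
        rev = self.docs.get(doc_id, (0, None))[0] + 1
        self.docs[doc_id] = (rev, body)

    def replicate_from(self, other):
        # Pull any document whose revision is newer than our copy.
        for doc_id, (rev, body) in other.docs.items():
            if rev > self.docs.get(doc_id, (0, None))[0]:
                self.docs[doc_id] = (rev, body)

a, b = Node("a"), Node("b")
a.put("contact:1", {"name": "Ada"})
b.replicate_from(a)          # b now holds a full copy
a.put("contact:1", {"name": "Ada Lovelace"})
b.replicate_from(a)          # incremental update: only the newer revision moves
print(b.docs["contact:1"])   # (2, {'name': 'Ada Lovelace'})
```

Because every node ends up with a complete, independently usable copy, losing one node (or one hosting provider) costs you nothing but a sync.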
In essence, this mirrors the self-healing nature of private and public virtual dedicated servers offered by Virtual Internet.
And, since the rise of cloud computing goes hand in hand with the surge in mobile devices, CouchDB is perfect for mobile app development because it occupies a tiny data footprint.
What it lacks in strict consistency it makes up for by being light, nimble and hyper-fast.
If WikiLeaks had been built on this replication model, there would be virtually no chance the site would ever go down unless government agencies acted in concert to shut down every site running the CouchDB technology at once. That would be challenging!
It remains to be seen just how far this new application will challenge the traditional database world but it implies true data independence for IT departments investigating public and private clouds.
On May 26, 2008, IBM broke a barrier that had stood in computer operations for over a decade: a sustained speed of 1 million billion calculations per second, otherwise known as one petaflop.
The computer that performed this feat is called Roadrunner, and it's a beast, costing over $133 million to build.
This goal had been set back in 2002, and engineers pinned their hopes on a Linux operating system and a hybrid design consisting of two AMD Opteron™ dual-core processors plus four PowerXCell 8i™ processors used as computational accelerators.
The PowerXCell chip is closely related to the technology used in the Sony PlayStation 3, and the pairing was considered a 'unique' approach to building a supercomputer.
Would you believe that, while lightning fast, this computer is now only the fourth-fastest supercomputer in the world, according to ZDNet?
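To put "1 million billion" in perspective, a few lines of arithmetic help (the 1-gigaflop desktop figure is an assumption for illustration, not a benchmark):

```python
# "1 million billion" calculations per second is 10**15 FLOPS,
# which is the definition of one petaflop.
one_million = 10**6
one_billion = 10**9
petaflop = one_million * one_billion
assert petaflop == 10**15

# Assumption for scale: a desktop sustaining 1 gigaflop (10**9 FLOPS).
# Work that takes a one-petaflop machine a single second would take
# that desktop a million seconds, i.e. roughly 11.6 days.
seconds_on_desktop = petaflop / 10**9        # 1,000,000 seconds
print(seconds_on_desktop / 86400, "days")    # ~11.57 days
```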
The machine is an example of the massive computing power that will soon be accessible via the Internet for scientific, research and military purposes.
We refer to this as cloud computing or computing "on-demand" which is fast, cheap, elastic and easy to operate. Virtual Internet offers smaller computational resources through VMware and Xen deployments we call virtual dedicated servers.
Roadrunner has now been installed at the Los Alamos National Laboratory in New Mexico, USA.
What's next? Big Blue has decided it is time to compete on the legendary American quiz show "Jeopardy!". This is a natural evolution in public relations since IBM's Deep Blue defeated chess grandmaster Garry Kasparov years ago.
The follow-on system, nicknamed "Watson", will push the boundaries of natural language processing in supercomputers.
Spoiler alert: Watson destroyed the competition in its televised matches.
In a recent white paper, we highlighted that 80% of all computing will take place in the cloud by 2020, with most of these interactions originating from smartphones and other wirelessly connected devices.
One interesting device being tested by the Microsoft Research Cambridge group is something called a cloud mouse, which is finding new ways to interact with virtual dedicated servers over the Internet.
“Given the resources of cloud computing,” says Richard Harper, a principal researcher with the Socio-Digital Systems group at Microsoft Research Cambridge, “a two-dimensional desktop layout is no longer sufficient to capture or convey rich, real-time relationships between data, people, schedules, or places. Cloud computing calls for new interaction metaphors, and these metaphors necessitate new input-output technologies.”
If you've seen the movies Tron, Minority Report or Eagle Eye, or used a motion controller with an Xbox game, you can immediately visualize some of the capabilities of this exciting beta project, which has brought together 45 of the world's leading human-computer interaction (HCI) specialists.
Here is one scenario they are working on:
“… Data is presented through handheld projectors or augmented eyeglasses, or displayed across multiple surface displays as 3-D visualizations. In terms of hardware, the Cloud Mouse is the key to this interactive experience.
When a user moves the Cloud Mouse through these data visualizations, differentiated sensory outputs such as vibrations or sound will alert users to locations where they can retrieve or view information, post or store information, or steer closer toward a target.”
If you watch the video attached to the article you will note how the mouse user is able to zoom into past and future events and tasks in a visual manner, almost like a video game.
In one demonstration, the controller drags a picture or video from one computer to another wirelessly connected device in real time. Very cool.
The ultimate goal of this research project is to find natural ways to interact with remote data in the cloud.
Data is being assembled into a set of reports, white papers and multimedia under the banner “Welcome to Being Human: Human-Computer Interaction in the Year 2020.”
Virtual Internet Exemplifies World-class Business Standards with ISO 9001 & 27001 Certifications
Virtual Internet (VI) – one of the UK’s leading business web hosting providers – has shown customers worldwide its commitment to the highest quality management and information security management requirements by being awarded the ISO 9001 and ISO 27001 certifications, respectively.
Virtual Internet, a renowned provider of business-class web hosting solutions, is proud to announce that it has received two official International Organization for Standardization (ISO) certifications: ISO 9001 and ISO 27001. These are two crucial steps toward ensuring top-notch product quality, increased customer satisfaction, cutting-edge marketing and organisation strategies, state-of-the-art data protection, enhanced benchmarking and more.
The ISO 9001 standard specifies Quality Management System (QMS) requirements, focused on an organisation’s ability to measure up to and improve upon customer satisfaction and service quality requirements. It is outstanding, reliable customer service that separates ISO 9001-certified companies from the rest of the market. Since 1996, Virtual Internet has exemplified this standard, providing its services with exceptional quality, reliability and performance. Now the company holds globally recognised certification that it operates an effective QMS and meets all of the international standards.
Sought after by organisations that manage sensitive information and need to continuously uphold data security, ISO 27001 is the latest standard designed exclusively for Information Security Management Systems (ISMS). The ISO 27001 certification includes a number of control categories such as information security policy, security organisation, personnel security, access controls, physical security, etc. When it comes to the hosting of critical data, a sophisticated ISMS is an absolute must. The ISO 27001 accreditation emphasises VI’s commitment to protecting customers’ valuable information by adhering to the most rigorous procedures governing security and business continuity.
“Achieving both these certifications proves that high standards in the protection of customers’ business, information and solution assets are among VI’s greatest priorities,” said Yana Kryuchkova, Marketing Manager, Virtual Internet.
“But it’s just a small part of what we call the VI-tal Support Promise. With VI-tal Support, our customers not only get 24/7 professional technical support from our VCP-qualified engineers but also complete peace of mind that lets them stop worrying and get on with running their business. As we like to say at VI: ‘Providing excellent service and support for our customers is not just important – it's vital.’ And it’s not just words – it’s the way we do business,” she added.
For more information about Virtual Internet please visit www.vi.net
About Virtual Internet
Since 1996, Virtual Internet has been a pioneer in the UK co-location and dedicated hosting market, offering industry-leading SLAs and 100% network uptime guarantees. It was also the first web host to deliver both Xen and VMware (private and public) cloud servers to the UK enterprise market. Each hosting package is augmented by free VI-tal Support, which has allowed the company to build strong client bases in the finance, government, education, health, media and travel sectors.
In our last blog post we highlighted that some data centers are now exceeding 1 million square feet in size. These centers draw massive power and consequently contribute to elevated carbon emissions in the atmosphere.
Some estimates reveal that there are now 15 million servers deployed by IT departments worldwide with more to come.
The carbon footprint generated by these data centers rivals the emissions produced by the airline industry, and it's getting worse.
To put this issue in perspective, and to show how the cloud may come to the rescue, consider Akamai's business model and what the company does:
If you use the Internet for anything - to download music or software, check the headlines, or book a flight - you've probably used Akamai's services without even knowing it. Today Akamai handles tens of billions of daily Web interactions for companies like Audi, NBC, and Fujitsu, and organizations like the U.S. Department of Defense and NASDAQ, powering brand new business models that serve the changing online economy.
Quite simply, if Akamai customers like Audi or NBC had to leave Akamai and build their own infrastructure, it would require something like 500,000 extra servers. By consuming Akamai's cloud instead, that footprint is limited to roughly 40,000 servers.
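The arithmetic behind that consolidation figure is worth spelling out (using the two server counts quoted above):

```python
# Rough arithmetic behind the shared-infrastructure claim.
own_build = 500_000   # servers if each customer built its own infrastructure
shared    = 40_000    # servers actually needed on the shared platform
reduction = 1 - shared / own_build
print(f"{reduction:.0%} fewer servers")   # 92% fewer servers
```

A twelve-fold consolidation in hardware translates directly into less power drawn and less carbon emitted.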
Likewise, by leveraging Virtual Internet's Content Delivery Network (CDN) for content and digital media delivery, you are consuming an efficient cloud service rather than increasing the world's carbon footprint by building out your own network.
Cloud computing promises to curb this growth in emissions by offering shared infrastructure "on demand" via the Internet.
Now large corporations do not have to build bloated on-premise data centers to accommodate their spiraling growth and manage their increased complexity.
It is becoming increasingly apparent as we head toward 2015 that IT data centers may well be one of the biggest contributors, perhaps up to 80%, to the growth in the carbon footprint.
Of course, any time you download music from Rhapsody instead of driving your car to the Virgin record store, that acts as a counterweight to data center emissions. But it is not enough.
IT departments can make a huge dent in global emissions by outsourcing their applications, platforms and infrastructure to hosting providers who offer both private and public virtual dedicated servers.
This strategy could be a game changer offering these specific and immediate benefits:
- Capacity on demand
- Higher performance
- Lower cost because infrastructure is shared
The decision to outsource your data center to a hosting provider, whether via a cloud or colocation model, should only be made if the host can provide round-the-clock support. For instance, if you decide to consume a Virtual Internet private or public cloud, you can expect free 24/7 VI-tal support, which never sleeps. This feature should be non-negotiable in any web host you select.
If you would like to find out more how cloud computing can help reduce your carbon footprint and maximize green IT please contact us here.