
Tuesday, 11 March 2014

Beyond Cloud Computing & Mobile Apps: Big Tech Trends

Forget cloud computing and mobile applications — they're so 2010.

So what are the next “wow factor” tech trends, ideas and products that will rock the world of financial advisers and their clients in the not-so-distant future?


Chances are, they will encompass the wizardry of “big data” algorithms, wearable tech for go-anywhere advisers, video-game-inspired business applications, deep content analysis by supercomputers such as IBM's Watson, and software that has an uncanny ability to read facial expressions and emotions.

In financial services, which was once a leader in technological change, the wow factor is now more likely to come from the consumer market, according to Neesha Hathi, senior vice president of Advisor Technology Solutions at The Charles Schwab Corp.

“Technology used to come through defense and the government to business, and then make its way down to consumers as the cost became more effective,” Ms. Hathi said. “But since the early 2000s, more often the new innovation is coming from the consumer side of the world. As soon as someone marketed the iPad to consumers, they said to themselves, 'Wait, I can use this in my business.'”

In an effort to identify emerging technology that will likely have a profound effect on the delivery of financial advice in the not-so-distant future, InvestmentNews talked to some of the best and brightest minds in adviser technology. We compiled a list of five important technological trends that financial advisers cannot afford to ignore.

Ram Nagappan, chief information officer at Pershing, looked at our list and concluded that many of the technologies presented here will change advisers' lives sooner rather than later.

“We feel that the future is already here,” he said.

“We're looking at all these technologies to apply to advisers so we [can] deliver the best experience to them and the end investor.”

BIG DATA

Jeff Bezos, founder and chief executive of Amazon.com Inc., has made no secret of his ambition to collect as much data as possible on affluent consumers so that he can sell them not just books and other media but electronics, household appliances and even groceries.

Amazon's success serves as inspiration to tech teams in the financial services industry, which are studying how to use big-data analytics and statistical probability to better know their customers, including advisers and their clients.

Big data is so big that even the smartest of technophiles have a hard time managing it. This is because it encompasses a huge flow of information about customers, products and services that companies have been gathering for years.

Much of this information, whether collected from traditional or digital databases, has moved into the cloud and continues to grow exponentially.

“With the explosion of big data and analytics, how do you digest all that information?” said Victor Fetter, chief information officer at LPL Financial.

“We believe it's a gold mine — the challenge is that it's moving fast. You have to adapt and create new models,” Mr. Fetter said.

Patrick Yip, director of Advisory Technology Strategy at Pershing, said that one of the first times he truly appreciated how big data works was when he received a Google Now alert on his Android smartphone telling him that commuter traffic was getting heavy where he lived, so if he wanted to beat the rush, he should leave home right away.

“Context is something that knows you and your preferences and location, and then responds to it,” he said.
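
A context-aware alert of that kind reduces to a simple rule evaluated against live signals. Here is a minimal sketch of the idea, where `get_commute_time` and `notify` are hypothetical stand-ins for a traffic feed and a push-notification service (this is an illustration, not Google Now's actual logic):

```python
from dataclasses import dataclass

@dataclass
class UserContext:
    home: str                 # where the user lives
    office: str               # usual destination
    usual_commute_min: int    # historical commute time

def check_commute(ctx, get_commute_time, notify):
    """Fire a 'leave now' alert when live traffic exceeds the norm.

    get_commute_time and notify are hypothetical stand-ins for a
    real traffic API and a push-notification service.
    """
    live = get_commute_time(ctx.home, ctx.office)   # minutes, from live traffic
    if live > 1.3 * ctx.usual_commute_min:          # 30% slower than usual
        notify(f"Traffic is building: {live} min to the office. "
               "Leave now to beat the rush.")
```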

Pointing to Amazon, which uses big-data algorithms to recommend products based on something that a consumer has previously purchased, Mr. Yip said that Pershing is looking for similar apps that it can recommend to advisers.
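
At its simplest, a "customers who bought X also bought Y" recommender just counts which items co-occur in purchase histories. A toy sketch of that idea follows; it is not Amazon's or Pershing's actual algorithm, which would add weighting, filtering and far more data:

```python
from collections import Counter
from itertools import permutations

def build_copurchase_counts(orders):
    """orders: iterable of item sets, one per customer order."""
    counts = {}  # item -> Counter of items bought alongside it
    for order in orders:
        for a, b in permutations(order, 2):
            counts.setdefault(a, Counter())[b] += 1
    return counts

def recommend(counts, item, k=3):
    """Top-k items most often bought together with `item`."""
    return [other for other, _ in counts.get(item, Counter()).most_common(k)]

orders = [{"book", "ereader"}, {"book", "lamp"}, {"book", "ereader", "case"}]
counts = build_copurchase_counts(orders)
print(recommend(counts, "book"))  # e.g. ['ereader', 'lamp', 'case']
```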

SMART OFFICE

The integrated smart office may be more of a designer's dream than a reality.

But in just a few years, advisers can expect to work with wearable devices, office products and even furniture that use cloud technology and integrated software platforms to help facilitate conversations with clients, said Ed O'Brien, senior vice president of Fidelity Institutional's platform technology.

Technology will be less visible as computers disappear into user-friendly hardware, he said, noting that Fidelity has designed an “office of the future” prototype on its Smithfield, R.I., campus that shows registered investment advisers how they will use all that new technology to better engage with their clients.

“You won't see a lot of physical servers and technology infrastructure,” Mr. O'Brien said. “You'll instead see more-collaborative workspaces with lots of mobile technology and integrated technologies.”

Fidelity's office of the future includes a smart coffee table that lets clients sit in a casual office setting with advisers while browsing the web, sharing reports and benchmarking themselves against investment goals.

Tablet presentation-sharing technology, meanwhile, allows for collaborative review of quarterly reports and can be accessed remotely. And a cloud-based virtual desktop for RIAs lets advisers work from anywhere they have access to a web browser.

Improved video conferencing and better gadget management also will catch on in the smart office. For example, the Consumer Electronics Show in Las Vegas in January offered a glimpse of where video is headed, with a Sony projector that can turn an entire wall into a TV screen, and an Intel smart bowl that someday will charge gadgets simply thrown into it.


Wearable technology, too, is headed advisers' way, Ms. Hathi said.

She pointed to Google Glass, the Pebble smartwatch and the Fitbit activity tracker, saying that what seems like a fun gadget will become a valuable business tool.

“We are just now exploring how wearables will be used in wealth management,” Ms. Hathi said.

Fidelity was the first major brokerage firm to make a public foray into wearable technology six months ago when the Fidelity Labs research and development unit was granted early access to Google Glass and created a Glassware app that lets wearers focus their vision on a logo of a publicly traded company to generate a real-time market quote, according to analysts at online and mobile research firm Corporate Insight.

'GAMIFICATION'

Advisers take their work seriously, so the idea of bringing game dynamics into their practices to encourage desired client behavior can make them nervous. But consumer websites and online communities have been using game mechanics to motivate participation and loyalty for years.

Gail Graham, chief marketing officer at advisory firm United Capital Private Wealth Counseling, says the firm has used its Money Mind Analyzer to work with 45,000 clients and prospects since 2010.


Money Mind's web app is played as a question-and-answer game by couples to determine whether each partner is most driven by fear, happiness or a need to commit.

The game leads couples to United Capital's Honest Conversation advice program, which comprises about 10,000 retail households, Ms. Graham said.
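
Mechanically, a quiz of this kind can be as simple as tallying which driver each answer signals and reporting the winner. A hypothetical sketch, not United Capital's actual implementation, assuming answers arrive pre-tagged with the driver they indicate:

```python
from collections import Counter

def dominant_driver(tagged_answers):
    """tagged_answers: list of 'fear' | 'happiness' | 'commitment' strings,
    one per quiz answer; returns the driver that dominates."""
    tally = Counter(tagged_answers)
    driver, _ = tally.most_common(1)[0]
    return driver

print(dominant_driver(["fear", "commitment", "fear", "happiness", "fear"]))
# fear
```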

More participants in the financial services industry are starting to venture into the new frontier of “gamification.”

Investing platform Kapitall Generation, for example, lures investors onto its platform by letting them play with a “fun and easy” $100,000 practice portfolio before trading for real.

In addition, game mechanics are being used by banks to draw in new digitally connected customers, according to Forrester Research Inc.

For example, PNC Bank's “Punch the Pig” game prompts customers to transfer money from their spending accounts to growth accounts.

Closer to the advisory world, custodians are leading the charge into gamification. For example, Pershing, at its annual Insite conference in June, used online game design to educate conference-goers about its NetX360 platform for advisers.

Also, Fidelity Labs has introduced a “Beat the Benchmark” experiment with online gaming in its office of the future's smart coffee table.

SUPERCOMPUTING

International Business Machines Corp.'s supercomputer, Watson, won “Jeopardy” in 2011 because it could sort and analyze vast amounts of data faster than its human competitors.

IBM is actively seeking to use Watson for industrial applications, and the supercomputer is moving into the realm of financial planning.

On the “Watson in finance” page of its website, IBM states that Watson is being designed as “the ultimate financial services assistant,” capable of performing deep content analysis and evidence-based reasoning to help advisers make informed decisions about investments, trading patterns and risk management.

Jon Patullo, TD Ameritrade Institutional's managing director for technology product management, is positive about this development, saying that he can see the value in supercomputers sifting through massive amounts of data, such as prospectuses, to help drive efficiencies in advisers' practices.

As a further sign of the supercomputer's growth, IBM said Feb. 26 that its Watson Group had launched a global competition to encourage developers to create mobile consumer and business apps powered by Watson.

MIND READING

Mind reading used to be an illusion invented by magicians and tricksters, but in the future, advisers will be able to do some conjuring of their own with voice, mood and facial analytics.

Although mood and facial analytics haven't yet entered the financial services arena, Pershing is using voice analysis, a technology that is catching on at call centers. Customer calls to Pershing are analyzed for empathy expressed by company representatives, silent time on calls and behavioral cues when customers use phrases such as “I'm so frustrated” and “I can't believe this takes so long.”
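
The phrase-spotting half of such voice analytics is easy to picture: once a call is transcribed (speech-to-text is assumed), the transcript is scanned for frustration cues and long stretches of silence are flagged. A minimal illustration, not Pershing's actual system:

```python
FRUSTRATION_CUES = [
    "i'm so frustrated",
    "i can't believe this takes so long",
]

def flag_call(transcript, silence_gaps_sec):
    """Return simple quality flags for one transcribed call."""
    text = transcript.lower()
    flags = [cue for cue in FRUSTRATION_CUES if cue in text]
    if any(gap > 10 for gap in silence_gaps_sec):  # long dead air on the line
        flags.append("excessive silent time")
    return flags

print(flag_call("Honestly, I'm so frustrated with this process.", [2, 14]))
# ["i'm so frustrated", 'excessive silent time']
```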


Beyond voice, cloud-based emotion capture technology now under development uses computer vision to recognize viewers' emotional responses to products and services.

Is the client happy, sad or confused? The software reads pixellated facial features, assessing shapes to infer how a person is feeling.
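
In code, the pipeline splits into two steps: locate the face, then classify the expression. The sketch below uses OpenCV's bundled Haar-cascade face detector, which is real, while the emotion classifier is left as a placeholder for a trained model of the kind Affdex or FaceReader ship:

```python
import cv2  # pip install opencv-python

# Real face detection via OpenCV's bundled Haar cascade; the emotion
# classifier below is only a placeholder for a trained model.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_pixels):
    """Placeholder: a real system feeds the face crop to a trained classifier."""
    return "neutral"

def read_frame(frame):
    """Return an inferred emotion for every face found in one video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    return [classify_emotion(gray[y:y + h, x:x + w]) for (x, y, w, h) in faces]
```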

Already, products such as Affectiva Inc.'s Affdex, Emotient.com, Face.com, Noldus Information Technology's FaceReader and Sightcorp have arrived on the market to provide companies with consumer analytics based on age, gender, eye tracking, facial expressions, mood and attention level. For example, Sightcorp's webcam eye-tracking software lets companies detect where product users' attention is focused in a controlled lab setting.

Expect to see mood and facial analytics enter the advisory industry in the next five to 10 years, said Oleg Tishkevich, chief executive of financial planning platform Finance Logix.

He predicts that advisers will be able to scan emotional feedback and metrics on how clients are responding to investment proposals or opportunities.

“As the adviser speaks to clients either in real time or on video, the software will read facial expression as they talk about their financial plan, and see if they're happy or sad, and recognize what is and isn't interesting,” Mr. Tishkevich said. “Anything that has a camera, including Google Glass, can be used to read emotion.” 

Monday, 8 April 2013

Microsoft, Google, and Apple: Which one faces doom in 2017?

Last week Gartner released yet another report predicting what the market for computing devices will look like in 2017.

And tech pundits have run with it, churning out one sensational headline after another: Microsoft will be obsolete, its influence is fading fast, it is sliding into irrelevance. And my favorite, “Gartner May Be Too Scared To Say It, But the PC Is Dead.” One could write a pretty good parody of the Monty Python “dead parrot” sketch just using the headlines.

There are two problems with what happened last week.
First, it’s Gartner, which has a track record of being spectacularly wrong with its predictions. Like the time in 2006 (and no, that is not a typo) when Gartner asserted that Apple’s only path to success was to quit the hardware business completely and license the Mac to Dell. Or the rolling forecasts in 2009 that started with Gartner projecting the “sharpest unit decline in history” and ended up with a report of “the strongest growth rate [in PC shipments] in seven years.”
Now, in fairness to the analysts who wrote this report, I think they have identified some likely trends. Sadly, those genuine insights are getting lost because they’re surrounded by tables full of numbers that are so specific as to be ludicrous.
But even if you take their numbers at face value, you need to actually understand them. With a few exceptions, most of the quick-and-dirty rewrites of Gartner’s press release got the story exactly wrong.
And that’s the second problem. All those reports focused on one shiny thing and ignored everything else in the report. Here, I’ve used my virtual yellow magic marker so you can see Gartner’s data as superficially as all those bloggers did:
[Chart: Gartner, worldwide device shipments by segment, 2012–2017]
Right. The market for conventional desktop and notebook PCs is declining, because people increasingly value mobility in the devices they use to perform basic computing tasks. So, Gartner predicts a 20 percent decline in demand for big, desk-bound PCs and conventional notebooks, most of which are heavy devices that remain on a desktop full time.
But what’s that line right below the highlighted one? What’s an Ultramobile?
The good folks at Gartner helpfully defined the term for CNET last summer:
Gartner describes the combination of ultrabooks and the MacBook Air as "ultramobile notebooks." Typically, ultramobile laptops are under 3.5 pounds and less than 0.8-inches thick.
In other words, these are lightweight PCs, typically with keyboards and trackpads, powered by the same operating systems used on those heavier desktop and conventional notebook models. Microsoft’s two-pound Surface Pro is a perfect example of this type of lightweight PC/tablet. So are hybrid Windows 8 devices like HP’s Envy X2, Samsung’s 500T and 700T, and even Dell’s 3.3-pound convertible XPS 12. Ultrabooks and MacBook Airs, which are the equivalent of PCs and MacBook Pros in every dimension except weight and thickness, are counted in that line too. In other words, some PCs are getting considerably lighter, but they’re still PCs.
So let’s redo Gartner’s numbers, this time combining the PC and Ultramobile lines.
[Chart: PCs and ultramobiles combined, 2012–2017, from Gartner data]
Wow, that’s a completely different story. Large, heavy, general-purpose PCs are becoming less popular, but demand for lightweight devices that can still function as general-purpose PCs is soaring. If you do the math, you’ll see that the increase is projected to be about 881 percent from 2012 to 2017. That phenomenal growth rate in the Ultramobile category means that overall, the number of shipments of devices running desktop operating systems (like Windows and OS X and even Chrome OS) will probably increase by 5 percent between 2012 and 2017.
At an average of about 340 million devices per year, that means roughly 1.7 billion new PCs (including 250 million or so in the Ultramobile category) will reach the market between 2013 and 2017, also known as the Windows 8 era. Not exactly a dead category.
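
The arithmetic behind those figures is worth making explicit. Using the 9.8 million 2012 ultramobile tally cited in the side note below, the 2017 figure implied by 881 percent growth works out like this:

```python
# Percent increase: (final - initial) / initial * 100.
ultramobile_2012 = 9.8e6                        # Gartner's final 2012 tally
ultramobile_2017 = ultramobile_2012 * (1 + 881 / 100)
print(f"{ultramobile_2017 / 1e6:.0f} million")  # ~96 million units in 2017

# Roughly 340 million PC-class devices a year over 2013-2017:
print(f"{340e6 * 5 / 1e9:.1f} billion")         # ~1.7 billion new PCs
```
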
If you trust the numbers, that is, which is a pretty big if.
(A side note from that CNET story: Last July Gartner said it expected about 10.7 million ultramobile units to ship in 2012. Gartner’s final tally for the year was 9.8 million, more than 8 percent lower than its projection just six months earlier. Likewise, last July they projected that the number of ultramobiles shipped in 2013 would be “about 17 million.” Nine months later, they’ve revised that projection upwards to 23.6 million, a change of about 39 percent in just nine months. Think about that before you get too transfixed by the detailed projections for 2014 and 2017.)
And what about that "obsolete,” “irrelevant,” “fading fast” Microsoft?
Well, again, if you trust in Gartner’s numbers enough to write a “Microsoft is doomed” blog post, you really need to look at all the numbers. Here, let me help.
[Chart: device shipments by operating system family, 2012–2017, from Gartner data]
[Data from Table 2 in this report, with RIM's tiny numbers added to the much larger "Other" category. I added percentages and trendlines.]
Wait, what? That same Gartner report says that Microsoft will struggle in 2013 and 2014 but then will dramatically increase its share of the overall market by 2017?
Exactly. Here’s what Gartner said in their summary press release:
In the shares of operating systems (OSs) in device sales, the shift to mobile and the fight for the third ecosystem becomes more evident. Android continues to be the dominant OS in the device market, buoyed by strong growth in the smartphone market (see Table 2). Competition for the second spot will be between Apple's iOS/Mac OS and Microsoft Windows.
I think that sounds about right.
Apple isn’t interested in winning market share at any cost. They want the high-margin customers. Microsoft is doing its best to build new-format devices that can work well in corporate environments where management is important. Android and Windows are both fighting aggressively to win share in emerging markets. The real loser is “Other.”
And before you start high-fiving Google over their complete dominance, it’s worth noting that Google’s direct share of the Android ecosystem might be a lot smaller than that of either of its two rivals. As my colleague Jason Perlow pointed out last week, the open nature of Android is a great blessing and an even greater curse for Google. Samsung, the largest maker of Android devices in the world, “will diverge from Google's OS and become a legitimate fork.” So will Amazon.
ZTE, Lenovo, and Huawei service primarily a domestic market in China, and will run their own weird domestic builds of Android with state-approved social networking software to keep the Chinese government happy...
This leaves us with no less than four, five, or six distinct forks of Android. Google as represented on Nexuses or Google Experience devices; Amazon; Samsung; HTC/Facebook; and whatever weird beast ends up running for domestic Chinese use. And BlackBerry 10's Dalvik implementation.
If you strip away the sensational headlines, the real story is pretty prosaic. The worldwide market for computing devices is changing rapidly, and three ecosystems (one of which is highly fragmented) have excellent prospects of becoming large enough to be taken seriously over the next five years.
Unless things change, which they always do.
Now go ahead and spin a clickbait headline out of that story. I dare you.

Full details: http://www.zdnet.com/microsoft-google-and-apple-which-one-faces-doom-in-2017-7000013637/

Friday, 15 February 2013

Cloud Data Management System: Modern superset of an RDBMS

Modern superset of an RDBMS

A cloud data management system (CDMS) is a modern superset of an RDBMS, designed to meet the needs of 21st-century applications and deployment infrastructures.
In addition to delivering the full range of RDBMS capabilities (including SQL, ACID transactions, and all of the tools and APIs that come with them), a CDMS must:
• Support modern datacenter hardware and management frameworks,
• Meet peak workload demands,
• Handle structured and unstructured datasets, and
• Support non-SQL paradigms.
In particular, a CDMS must embrace modern dynamic and flexible cloud computing environments.

Elastic Scale-out for Extreme Performance

A CDMS must deliver capacity on demand by adding or deleting computational and storage resources in a running database. A CDMS must be able to elastically scale out to very high transaction volumes – in the millions of transactions per second (TPS) – and web-scale database sizes – in the petabytes of data – by the addition of real or virtual machines, networks and storage systems to a live database. A CDMS must also scale in gracefully when resources are no longer needed.

Single Logical Database

No matter how complicated the application, a CDMS must present its users with the view of a single, logical, consistent and always available database. A CDMS must shield users from having to employ explicit partitioning, sharding or caching techniques to achieve massive database scalability. The CDMS must obviate or encapsulate these complexities, so that a developer or administrator can focus on using the database no matter the scale or complexity.
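
To make "obviate or encapsulate" concrete: the CDMS routes every key to a partition internally, so application code never names a shard. A toy sketch of the kind of hash-based routing a CDMS hides from its users (real systems use consistent hashing, replication and rebalancing):

```python
import hashlib

class LogicalDatabase:
    """Toy illustration: callers see one database; partitioning is internal."""

    def __init__(self, n_partitions=4):
        self._partitions = [{} for _ in range(n_partitions)]

    def _partition_for(self, key):
        digest = hashlib.sha1(key.encode()).hexdigest()
        return self._partitions[int(digest, 16) % len(self._partitions)]

    def put(self, key, value):
        self._partition_for(key)[key] = value   # caller never names a shard

    def get(self, key):
        return self._partition_for(key).get(key)

db = LogicalDatabase()
db.put("account:42", {"balance": 100})
print(db.get("account:42"))
```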

Run Anywhere, Scale Anywhere

A CDMS must be able to run on any infrastructure from single machines to private clouds, public clouds and combinations of the above. It must be able to run in a heterogeneous environment incorporating different machines, virtual machines, operating systems, or network infrastructures. A CDMS should excel on enterprise and commodity hardware equally.

Nonstop Availability

A CDMS must be capable of running continuously – for months or years – without failing or being made unavailable for maintenance.
A CDMS cannot have a single point of failure. It must presume infrastructure failure and be self-aware enough to detect it, handle systems changes and recognize extreme events like network partitions. It should remain available, or, if that is impossible, fail in a graceful and consistent fashion. It should be able to decide how to react to network partitions, either by failing some portion of the system or by understanding how to reconcile changes once the network has stabilized.
A CDMS must support live rolling upgrades of hardware, operating systems and CDMS versions, and must support dynamic changes to schemas and other database administration tasks without shutting down CDMS availability.

Dynamic Multi-tenancy

A CDMS must be dynamically multi-tenant. It must be able to manage large numbers of databases on a finite set of resources, and be able to reassign resources to databases as needed. A CDMS must be able to hibernate inactive databases and wake them on demand.

Active/Active Geo-distribution

A CDMS must be able to run concurrently in multiple datacenters to support geographically distributed workloads, always-on applications, and for disaster recovery. A CDMS must deliver active/active operations with transactionally consistent semantics, work across and between Wide Area Networks and understand how to localize activity or caches.

Embrace Cloud

A CDMS must integrate and run in a cloud environment, and be designed to support cloud-scale performance requirements while being resilient against the inherent concurrency and latency challenges. It must be able to provide transactions per second (TPS) and load rate guarantees and maintain those rates as latency spikes and concurrent load grows. It must support cloud management frameworks and integrate with modern cloud stacks.

Store Anywhere, Store Redundantly

A CDMS must be able to store the data anywhere: Locally, remotely, in a datacenter or on a public or private cloud. A CDMS should also be able to store the data in whatever storage system is appropriate: on a directly attached file system, a local Key/Value store or on a cloud-based storage service. It should be able to store all data redundantly in multiple locations, simultaneously and with transactional consistency, using a heterogeneous mix of storage locations and storage technologies.

Workload Mix

A CDMS must be flexible in the kinds of workloads it supports, and efficient in running different workloads concurrently. It should be able to support highly scalable web-facing applications with primary requirements that include high transaction throughput, web-scale concurrency and very low latency. It should be able to support enterprise applications that involve complex transactions and a more even mix of reads and updates. It should be able to support analytical applications, with a premium on un-cached reads and long-running queries. A CDMS should also support logging-style applications with a focus on sustained appending of data.
A CDMS must be able to perform backups without taking the system down, and run analytical queries without interfering with transaction processing.

Tunable Durability Guarantees

A CDMS must allow a user to define infrastructure reliability constraints that control the trade-off between durability guarantees and database performance. A user must be allowed to define whether a transactional commit means that the data is safely written to storage in one place, written to storage in K places, written to storage in M non-collocated places, stored in RAM in N places, or something else.
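
As an illustration only, such a contract might surface as a per-commit durability policy. A hypothetical API sketch, with no specific CDMS product implied:

```python
from dataclasses import dataclass

@dataclass
class Durability:
    """Hypothetical per-commit durability policy for a CDMS."""
    storage_copies: int = 1   # K: copies safely written to storage
    distinct_sites: int = 0   # M: copies on non-collocated machines
    ram_copies: int = 0       # N: in-memory replicas

FAST     = Durability(storage_copies=0, ram_copies=3)    # RAM-only, lowest latency
BALANCED = Durability(storage_copies=2)                  # two disks, one site
PARANOID = Durability(storage_copies=3, distinct_sites=2)

# txn.commit(durability=PARANOID)  # hypothetical call: blocks until policy is met
```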

Distributed Security

A CDMS must have enterprise class security at system level and database level, including:
• Authentication and access control of machines before they are accepted into the trusted group,
• Authentication and access control of database processes before they are allowed to participate in a particular database,
• Encryption of all communications between machines, and
• Database-level security for users of the database.

Empower Developers & Administrators

Empowering Developers:
• A CDMS must support rapid application development and frictionless application evolution,
• It should be easy to use, without time-consuming requirements for provisioning of database servers, or inflexible schemas that slow down application development,
• It must be integrated with modern programming languages and APIs, database development tools and application development frameworks,
• It must support flexible schemas with user-defined types in order to provide a clean layer for arbitrary language integration that is agnostic to row or column orientation. A CDMS should enable users to easily update and redefine data as their applications change.

Empowering Administrators:
• A CDMS should provide a single secure point of administration for all its databases and resources. It should make it simple to automate logging, auditing, profiling, process management and resource allocation,
• It should enable policy-driven, zero-admin services that manage the system as a whole,
• A CDMS should also support the separation of database administrator and systems administrator as distinct roles sharing this single point of management, since these roles become more distinct in a cloud environment.

Thursday, 14 February 2013

Health care and Cloud Computing

For years, hospitals have longed to bring computers into the exam rooms, waiting rooms, and treatment rooms to get rid of hard-to-read patient charts, make sure everyone treating a patient was seeing the same information, record everything from vital signs to care delivery, and let doctors, nurses, and hospital techs stay connected to vital information and services as they move throughout the hospital.

Cloud computing offers significant benefits to the healthcare sector. Doctors' clinics, hospitals, and health clinics require quick access to computing and large storage facilities that traditional settings do not provide; moreover, healthcare data needs to be shared across various settings and geographies, which further burdens the healthcare provider and the patient, causing significant delays in treatment. The cloud caters to all of these requirements, giving healthcare organizations an incredible opportunity to improve services to their customers (the patients), to share information more easily than ever before, and to improve operational efficiency at the same time. The flip side is that healthcare data carries specific requirements such as security, confidentiality, availability to authorized users, traceability of access, reversibility of data, and long-term preservation. Hence, cloud vendors need to account for all of these while conforming to regulations such as HIPAA and Meaningful Use.

Indeed, the cloud computing market in the health care sector is expected to grow to $5.4 billion by 2017, at a CAGR of 20.5% from 2012 to 2017, according to research firm MarketsandMarkets. Although cloud computing offers significant advantages to HCOs and other stakeholders, it has its own set of restraints. Security of patient information, interoperability and compliance with government regulations are some of the factors that are slowing down this market.

The health care sector is beginning to move to cloud-based platforms, despite the common belief that compliance and security issues would hinder the shift. The major driving factors are the need to increase storage and compute capacity using limited dollars and the ability to centrally manage patient data that now exists in silos.

Despite this growth, many in health care are still pushing back on cloud computing, citing security and privacy issues. But others are finding better security models and technology in the cloud. Moreover, most health care organizations moving to cloud computing are doing so to reduce operational costs, because many have very limited budgets -- a powerful motivation that will overcome the overblown security and privacy excuses.

Still, this transition won't be pain-free. Most IT organizations in the health care sector don't have the talent required to move their systems safely to cloud-based platforms, and they may not understand the compliance and security issues as well as they should. However, the default of "do nothing" is not acceptable considering that the IT backlog is growing again -- and budgets are not. It's time to get creative and innovative around the use of new technology, including cloud computing.

That's especially good for health care, which should get a much higher return on investment from cloud adoption than other sectors will. The amount of data that health care providers must deal with is daunting, and it is typically managed in unconnected silos. That causes huge costs, both in management overhead and in inefficiencies, including some that lead to mistreatment because each person treating a patient sees only part of the picture.

In moving to the cloud, the health care industry will find new opportunities for data consolidation or aggregation of patient data to help physicians and clinicians make better decisions, while their organizations should save money through reduced redundancy and cheaper operational costs. It's time for health care to capitalize on the cloud opportunity -- as smart organizations have already realized.


The report segments the cloud computing market in healthcare as follows:
  • Global healthcare cloud computing market, by applications
    • Clinical Information Systems (CIS)
      • Electronic Medical Records (EMR)
      • Picture Archiving and Communication System (PACS)
      • Radiology Information System (RIS)
      • Computerized Physician Order Entry (CPOE)
      • Laboratory Information System (LIS)
      • Pharmacy Information System (PIS)
      • Others
    • Non-Clinical Information Systems (NCIS)
      • Revenue Cycle Management (RCM)
      • Automatic Patient Billing (APB)
      • Cost accounting
      • Payroll
      • Claims management
  • Global healthcare cloud computing market, by pricing model
    • Pay-as-you-go
    • Spot pricing
  • Global healthcare cloud computing market, by deployment model
    • Public cloud
    • Private cloud
    • Hybrid cloud
  • Global healthcare cloud computing market, by components
    • Software
    • Hardware
    • Services
  • Global healthcare cloud computing market, by service model
    • Software-as-a-Service (SaaS)
    • Platform-as-a-Service (PaaS)
    • Infrastructure-as-a-Service (IaaS)
  • Healthcare cloud computing market, by geography
    • North America
    • Europe
    • Asia
    • Rest of the World (ROW)

Monday, 11 February 2013

TOP 15 Emerging Technologies

Research firm Forrester understands that everyone who’s been listening with even one ear knows that mobile, social, cloud, and data are big freight trains of change that are blowing up old business models and old business practices.

The top 15 emerging technologies fall into four groups:

End user computing technologies

1. Next-generation devices and UIs
New sensors and new user interfaces. Think Leap Motion.
2. Advanced collaboration and communication
Think social inside, like Yammer or other social-inside-the-enterprise solutions.
3. Systems of engagement
Real-time data, in everyone’s hands. Think Roambi.

Sensors and remote computing technologies

4. Smart products
Things that can sense, react, and communicate. Think operating system for places and buildings.
5. In-location positioning
GPS and in-building location sensors.
6. Machine-to-machine networks
Background intelligence on people and things. Think ReelyActive.

Process data management technologies

7. Smart process applications and semantics
Real business processes are a lot messier than your flow charts. Smart process apps know that.
8. Advanced analytics
Smarter, more predictive data. Think Cloudera’s Impala tool for Hadoop.
9. Pervasive BI
People need business intelligence that comes every hour, not at the end of the month.
10. Process and data cloud services
Scalable, burstable, and cheap computing capability. PaaS, BaaS, etc.

Infrastructure and application platforms

11. Big data platforms
Infrastructure to handle big data at high speed … and use all that data you’ve been uselessly storing.
12. Breakthrough storage and compute
Yes, hardware may still be necessary, even if you’re never going to be like Google.
13. Software-defined infrastructure
Software that dynamically routes your networking and data center capabilities.
14. Cloud application frameworks
Technologies for deploying and running distributed apps in the cloud, like, perhaps, a multi-continent-spanning database.
15. New identity and trust models
New federated trust and identity models for a changing world of jobs and careers … and maybe even killing all usernames and passwords.

Read more at http://venturebeat.com/2013/02/07/forresters-top-15-emerging-technologies/#Gy0g4kfil8OcMOUw.99

Sunday, 27 January 2013

Cloud Computing: Pros & Cons

ON THE UPSIDE

1. Fast start-up

"Cloud computing is really a no-brainer for any start-up because it allows you to test your business plan very quickly for little money. Every start-up, or even a division within a company that has an idea for something new, should be figuring out how to use cloud computing in its plan," says Brad Jefferson, CEO of Animoto, a New York company that creates full-motion videos out of customer-selected photos and music. "Cloud computing has changed the game for entrepreneurs -- the greatest part about it is that on launch day, you have the confidence that you scale to the world."

2. Scalability

To figure out if you're a good cloud service prospect, first consider the variability of the resource utilization of your own IT structure, says Tom Nolle, CEO of CIMI, a high-tech consulting firm. "If you've got enormous peaks and valleys, you're forced to oversupply IT resources to address the peaks. It may be significantly less costly for you to outsource the peaks," he says.
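
A back-of-the-envelope model makes Nolle's point concrete. With purely illustrative prices, compare owning capacity sized for the peak against owning the baseline and renting the bursts:

```python
# Illustrative numbers only: compare owning capacity sized for the peak
# with owning the baseline and renting cloud capacity for the bursts.
baseline_servers, peak_servers = 20, 100
own_cost_per_server_month = 300          # hypothetical amortised monthly cost
cloud_cost_per_server_hour = 0.60        # hypothetical on-demand price
peak_hours_per_month = 40                # bursts are rare

own_the_peak = peak_servers * own_cost_per_server_month
hybrid = (baseline_servers * own_cost_per_server_month
          + (peak_servers - baseline_servers)
          * cloud_cost_per_server_hour * peak_hours_per_month)
print(own_the_peak, hybrid)   # 30000 vs 7920: outsourcing the peaks wins here
```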

3. Business agility

"Your mind really changes quickly when you can solve problems using IT resources but you don't need a long-term commitment and you don't have to wait a long time to get them," says Michael Crandell, CEO of RightScale, a cloud management and support company. "Cloud computing changes the whole pattern of agility at a much lower cost."

4. Faster product development

Since moving some applications and data to Amazon's cloud last April, Eli Lilly & Co. has seen provisioning time drop from weeks to minutes, says Dave Powers, associate information consultant at the Indianapolis company. "If I can give scientists eight weeks back on their research, that's a huge value there," he adds. "This is really starting to impact how we do business. We're starting to reduce cycle times in research, which is critical for us. That's a trickle-down effect of technology that we can make available to the scientific community."

5. No capital expenditures

Are you out of space in your data center? Have your applications outgrown the infrastructure? Cloud computing services allow a company to shift from capital to operational expenses even in do-or-die cases, says Bernard Golden, CEO of HyperStratus, a consulting firm specializing in advanced IT technologies.

ON THE DOWNSIDE

1. Bandwidth could bust your budget

Such was the case at Sony Pictures Image Works, which considered then ruled out an external cloud service to address storage scalability challenges, says Nick Bali, senior systems engineer at the Culver City, Calif., company. Every day, Sony animators access and generate between 4 and 12 terabytes of data. "The network bandwidth we'd need to put that into someone's cloud and to read it back is tremendous, and the cost would be so large that we might as well buy the storage ourselves rather than paying someone else for it," he says. Now Sony is evaluating a private storage cloud, using ParaScale's cloud storage software.
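
Bali's numbers are easy to sanity-check: moving 12 terabytes into a remote cloud each day implies a fat sustained pipe, before even counting the reads back:

```python
terabytes_per_day = 12
bits = terabytes_per_day * 1e12 * 8      # decimal terabytes to bits
sustained_gbps = bits / 86_400 / 1e9     # spread evenly over 24 hours
print(f"{sustained_gbps:.2f} Gbps")      # ~1.11 Gbps sustained, writes alone
```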

2. App performance could suffer

A private cloud might lead to improved application performance, but a public cloud definitely wouldn't -- not when network latency is taken into account, says Tony Bishop, CEO of Adaptivity, a consulting firm specializing in next-generation IT infrastructure.

"I couldn't see an investment bank putting a latency-sensitive application on an external cloud," adds Steve Harriman, a vice president at NetQoS.

3. Data might not be cloud-worthy

"On Day 1, we probably had eight to 10 applications that we would have loved to take into the cloud," says Eli Lilly’s Powers. "But, knowing the type of data we had and the classification [of who could see it], we decided going through internal governance and rigor around taking care of that data would be appropriate." And, definitely don't put an application that provides competitive advantage or contains customer-sensitive information in the public cloud, Bishop adds.

4. Too big to scale

"The bigger you are, the bigger your IT resource pool. And the bigger your IT resource pool, the less likely it is that you'll see any enormous financial advantage in outsourcing to the cloud," CIMI's Nolle says. "Cloud computing promotes better resource utilization, … but the gains are greatest when moving from relatively small consumption of resources upwards. If you're a very large enterprise, you might find you can achieve better economy by doing your own cloud than going to an outsourced one."

5. Human capital may be lacking

Exploring next-generation IT models requires an adventuresome spirit and technical astuteness, says HyperStratus' Golden. "If you don't have the human capital that's willing to stretch and learn new things, taking on cloud computing can be very frustrating."

For details, visit:

http://www.networkworld.com/supp/2009/ndc3/051809-cloud-pro-con.html

Thursday, 3 January 2013

Cloud Computing


Cloud Computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility over the Internet.  As a metaphor for the Internet, "the cloud" is a familiar cliché, but when combined with "computing," the meaning gets bigger and fuzzier. Some analysts and vendors define cloud computing narrowly as an updated version of utility computing: basically virtual servers available over the Internet. Others go very broad, arguing anything you consume outside the firewall is "in the cloud," including conventional outsourcing.

A simple example of cloud computing is web-based email such as Yahoo Mail, Gmail, or Hotmail. You don't need software or a server to use them; all a consumer needs is an internet connection to start sending emails. The server and the email management software are all in the cloud (the internet) and are managed entirely by the cloud service provider (Yahoo, Google, etc.). The consumer gets to use the software alone and enjoy the benefits. The analogy is: if you need milk, would you buy a cow? All users or consumers need is the benefit of using the software or hardware, such as sending emails. Just to get this benefit (the milk), why should a consumer buy the cow (the software or hardware)?

Overview

Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. A parallel to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

Cloud computing describes a new supplement, consumption, and delivery model for IT services based on Internet protocols, and it typically involves provisioning of dynamically scalable and often virtualised resources. It is a byproduct and consequence of the ease-of-access to remote computing sites provided by the Internet. This may take the form of web-based tools or applications that users can access and use through a web browser as if the programs were installed locally on their own computers.

Cloud computing providers deliver applications via the internet, which are accessed from web browsers and desktop and mobile apps, while the business software and data are stored on servers at a remote location. In some cases, legacy applications (line of business applications that until now have been prevalent in thin client Windows computing) are delivered via a screen-sharing technology, while the computing resources are consolidated at a remote data center location; in other cases, entire business applications have been coded using web-based technologies such as AJAX.

At the foundation of cloud computing is the broader concept of infrastructure convergence (or Converged Infrastructure) and shared services. This type of data center environment allows enterprises to get their applications up and running faster, with easier manageability and less maintenance, and enables IT to more rapidly adjust IT resources (such as servers, storage, and networking) to meet fluctuating and unpredictable business demand.

Most cloud computing infrastructures consist of services delivered through shared data-centers and appearing as a single point of access for consumers' computing needs.

Comparison

Cloud computing shares characteristics with:

Autonomic computing — Computer systems capable of self-management.

Client–server model — Client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).

Grid computing — "A form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."

Mainframe computer — Powerful computers used mainly by large organisations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.

Utility computing — The "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."

Peer-to-peer — Distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client–server model).

Characteristics

Cloud computing exhibits the following key characteristics:

Agility improves with users' ability to re-provision technological infrastructure resources.

Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs.
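
To illustrate, a REST-style cloud API exposes resources as URLs that a program manipulates with plain HTTP verbs. A generic sketch using Python's requests library against a hypothetical provider endpoint (the URL, token and fields are invented for illustration):

```python
import requests  # pip install requests

BASE = "https://api.example-cloud.com/v1"   # hypothetical provider endpoint
HEADERS = {"Authorization": "Bearer <token>"}

# Create a virtual server, then list servers: plain HTTP, no vendor SDK needed.
resp = requests.post(f"{BASE}/servers", headers=HEADERS,
                     json={"name": "web-1", "size": "small"})
resp.raise_for_status()
server = resp.json()

servers = requests.get(f"{BASE}/servers", headers=HEADERS).json()
```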

Cost is claimed to be reduced, and in a public cloud delivery model capital expenditure is converted to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for in-house implementation.

Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.

Multi-tenancy enables sharing of resources and costs across a large pool of users, thus allowing for:
• Centralisation of infrastructure in locations with lower costs (such as real estate, electricity, etc.),
• Peak-load capacity increases (users need not engineer for highest possible load-levels), and
• Utilisation and efficiency improvements for systems that are often only 10–20% utilised.

Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.

Scalability and elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time, without users having to engineer for peak loads.
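
The "on-demand" provisioning behind this elasticity is often just a control loop over a utilisation metric. A minimal threshold-based sketch, with hypothetical add_instance and remove_instance hooks standing in for a provider's provisioning API:

```python
def autoscale(avg_cpu, n_instances, add_instance, remove_instance,
              high=0.75, low=0.25, min_n=2):
    """One tick of a threshold autoscaler; callers supply provisioning hooks."""
    if avg_cpu > high:                          # saturating: scale out
        add_instance()
        return n_instances + 1
    if avg_cpu < low and n_instances > min_n:   # idle: scale in, keep a floor
        remove_instance()
        return n_instances - 1
    return n_instances
```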

Performance is monitored and consistent, and loosely coupled architectures are constructed using web services as the system interface.

Security could improve due to centralisation of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.

Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer.