
Forrester outlines top tech trends – from now till 2018 February 7, 2013

Posted by Mark Hillary in Current Affairs, Internet, IT Services, Outsourcing.

As decision-makers risk being blindsided by how and when to use emerging technologies, Forrester analyst Bryan Hopkins recently provided some helpful insight into what’s next on his blog.

Grouped in four major blocks, he outlined the top 15 major trends in tech that will be shaking up business models over the next five years.

It is easy to fall into the crystal ball-gazing trap, especially when you are talking about what will happen in technology between now and 2018. But a clear thread runs through Hopkins’s predictions.

In the end user computing group, advanced collaboration and computing tools will continue to be of major importance to companies worldwide. This is crucial because of the increasingly dispersed nature of businesses and the war for skilled personnel – you need to get the right people to work effectively together and also retain as much as possible of their knowledge when they move on.

The sensors and remote computing technologies theme refers to the external, customer-facing side of technology. Here, the Promised Land is that of smart machines performing the collection and processing of data, then contextualising it to generate the nuggets of gold that can inform brands on what to offer to consumers, where and how.
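To make the idea concrete, here is a minimal, entirely hypothetical sketch of what ‘contextualising’ sensor data might look like – the footfall feed, the threshold and the offer logic are invented for illustration, not taken from Forrester’s report:

```python
# Hypothetical sketch: turning raw in-store footfall readings into a
# contextual, actionable signal. Thresholds and suggestions are invented.

def contextualise(readings, location):
    """Summarise raw footfall counts into a simple recommendation."""
    avg = sum(readings) / len(readings)
    busy = avg > 100  # hypothetical threshold for a 'busy' store
    return {
        "location": location,
        "average_footfall": avg,
        "suggestion": ("staff up and push in-store offers" if busy
                       else "run online promotions instead"),
    }

insight = contextualise([80, 120, 150, 110], "High Street branch")
```

The interesting step is the last field: the raw numbers become a suggestion a brand can act on, which is the ‘nuggets of gold’ idea in miniature.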

But that information doesn’t just appear by magic. So tools that provide the advanced analytics capability outlined in Hopkins’s process data management topic – that is, digesting and making sense of structured and unstructured information quickly and cheaply – will be very useful to companies focused on understanding their audience.
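As a toy illustration of pulling structure out of unstructured information, the sketch below counts topic mentions in some invented customer comments – real analytics platforms are vastly more sophisticated, but the principle is the same:

```python
# Toy example: extracting a structured signal (term frequencies) from
# unstructured free text. The comments are invented for illustration.
from collections import Counter

comments = [
    "Love the new delivery options, very fast delivery",
    "Checkout was slow, but delivery was fast",
    "Great prices and fast checkout",
]

words = Counter(
    word.strip(",.").lower()
    for comment in comments
    for word in comment.split()
)

# The most frequent terms hint at what customers are talking about.
top_terms = [w for w, _ in words.most_common(3)]
```

Even this crude word count surfaces ‘delivery’ and ‘fast’ as themes; the point of the analytics platforms Hopkins describes is doing this at scale, quickly and cheaply.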

Finally, there needs to be a robust platform holding that glue of knowledge together. So, as described in the analyst’s infrastructure and application platforms topic, big data platforms to handle large volumes of data, elastic storage capability and everything-as-a-service will continue to be the talk of the town for the years to come. Are you ready?

Read Bryan Hopkins’s blog entry on emerging trends here.

Photo: crystal ball, by Justin Glass, licensed under Creative Commons


No chance of getting caught November 26, 2012

Posted by Mark Hillary in Current Affairs, Internet.

A computer hacking group caused losses totalling millions to financial service companies such as Mastercard and PayPal, according to this news report of an ongoing case at Southwark Crown Court.

What is striking about the case is that the defendants believed in safety in numbers – if many people attacked together, it would be far harder for law enforcement authorities to prosecute individuals – and that there was no financial gain to be made from the attacks. It was really just a protest.

The targets drew the hackers’ attention through actions such as making anti-piracy statements or failing to support Wikileaks.

Taken more broadly there is a very serious risk for any business here that the tabloid news coverage fails to mention.

If a company can so easily be prevented from trading by loosely affiliated hacking groups with very little chance of being caught and punished, then there is a serious commercial risk. Any business stating a view on piracy or freedom of information that upsets someone can be targeted easily. Even crude methods, such as flooding a website with hits or messages, can take down and disrupt a business.

Information security has long been an integral part of the business strategy for companies such as retailers taking online orders, but it seems now that almost any firm engaging in any online transactions needs to take this seriously – or face a sudden loss of business if they attract the attention of hackers.

Photo: hacking, by Miria Grunick, licensed under Creative Commons

What is Web 3.0? July 22, 2011

Posted by Mark Hillary in Internet, Outsourcing, Software.

The Internet continues to evolve at a frenetic pace. Back in the nineties, having a website meant little more than a series of static pages that used hyperlinks to allow the reader to click between pages.

Web 2.0 changed all that. Websites became based on dynamic data, so different readers might see different pages, based on their own profile. Your Facebook profile is a good example – endlessly changing whenever you update it or load new content such as photos. It became normal for readers to also become contributors.

Now the tech world is talking of Web 3.0, even as many in the enterprise are yet to fully take advantage of the dynamic information flow of Web 2.0.

But Web 3.0 is not really here just yet. It revolves around how information can be better linked through concepts such as the semantic web. In short, there will be a point at which the systems are publishing information automatically and tagging or linking the data to existing information. Like Web 2.0, but with the computers doing much of the publishing and linking for us.

The clear advantages of this are obvious. We are drowning in a sea of information at present. Just search Google for ‘John Smith’ and hundreds of millions of possible results come up. If your own name is ‘John Smith’ and the search system had some way of linking data that relates to the correct ‘John Smith’ then search suddenly becomes far more intelligent.
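The semantic web idea can be sketched with subject–predicate–object triples. The entities and predicates below are invented for illustration; real systems use RDF stores and query languages such as SPARQL, but the disambiguation principle is the same:

```python
# Sketch of semantic-web-style linked data: facts as
# (subject, predicate, object) triples. Data invented for illustration.

triples = [
    ("john_smith_1", "profession", "dentist"),
    ("john_smith_1", "city", "Leeds"),
    ("john_smith_2", "profession", "guitarist"),
    ("john_smith_2", "city", "London"),
]

def find(predicate, obj):
    """Return every subject linked to obj via predicate."""
    return [s for s, p, o in triples if p == predicate and o == obj]

# A search for 'John Smith the guitarist' now resolves to one entity,
# instead of millions of undifferentiated text matches.
matches = find("profession", "guitarist")
```

Because the facts are machine-readable links rather than free text, a search system can follow them to the correct ‘John Smith’ rather than returning every page that mentions the name.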

Given the amount of content now being created it is becoming essential for the systems to help connect the dots. For example, the video site YouTube gets 35 hours of new video uploaded by users every single minute. How can we make sense of this vast sea of data if it has no context?
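The arithmetic behind that statistic is worth spelling out:

```python
# The scale implied by 35 hours of video uploaded to YouTube per minute.
uploaded_per_minute = 35                    # hours of video
per_day = uploaded_per_minute * 60 * 24     # hours uploaded each day
per_year = per_day * 365                    # hours uploaded each year

# Watching a single day's uploads non-stop, 24 hours a day,
# would itself take more than five years.
years_to_watch_one_day = per_day / (24 * 365)
```

That is over 50,000 hours of new video every day – no human curation effort can keep up, which is exactly why machine-generated context matters.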

The downside of relying on the technology is that machines make mistakes. Only time will tell how laws designed for a previous era might handle cases related to an automated system linking millions of pieces of data, where some of those links are erroneous and create a knock-on effect that invalidates other data.

It’s a problem we have yet to encounter, but this world is just around the corner, not decades away.

Who is the customer? February 24, 2011

Posted by Mark Hillary in IT Services, Outsourcing.

Possibly the most important discussion I had at the Nasscom event in India recently was around the changing role of the CIO, and how this changes the whole relationship between the client and the service supplier.

The CIO is an evolving role. It is becoming more strategically significant and is focusing more purely on information use and flow – which means that there is less emphasis on the purchase of IT systems.

At the same time, services are getting easier to buy. Companies are offering complete solutions that can be delivered using a web browser so absolutely no infrastructure or software is required – beyond Internet access.

So business heads are getting far more involved in specifying what they need and even going to the market and purchasing it without any involvement from the IT department. In fact, if there is no IT infrastructure requirement then why would the IT department need to know what is being purchased or used by the business?

This has always been the case in BPO. The person buying a new HR system was the HR director – not the CIO. They might purchase a system in communication with the CIO, but ultimately the decision was that of the business line head.

Now the same is applicable for a wider variety of systems – even technical systems that would previously have needed agreement from the CIO.

Does it make the CIO redundant as a function? I don’t think so, as there still needs to be a strategy around infrastructure and security, but this does signal a complete change in the way companies use IT. The business user not only has the budget, but also the power to buy, install, and maintain their own systems without any IT department involvement.