Pimp Your Hardware with Kingston Tech

Big data analytics hardware doesn’t need to cost a fortune: commoditised hardware, along with upgraded existing assets, can be used to drive big data.

The vendors will do all that they can to dissuade you from installing other providers’ hardware, suggesting that commoditised hardware provides insufficient reliability and that upgrading your existing hardware (with memory other than their own) risks voiding the warranty on the equipment in question.

Thanks to Kingston Technology, though, we are happy to put your mind at ease. As much as sales representatives might attempt to coerce you into purchasing memory modules from them, usually at much higher prices than you can get elsewhere (a ploy known as a “tie-in sales provision”), such provisions are generally illegal. Indeed, they are specifically prohibited in the consumer market by section 102(c) of the little-known Magnuson-Moss Warranty Act of 1975 and can also violate sections 1 and 2 of the Sherman Antitrust Act. This useful blog outlines both statutes in more detail.

To remain competitive in today’s hyperconnected, saturated markets, firms of all sizes must look at how they leverage not only their own data (sometimes their greatest asset) but also the massive stores of third-party data available, from Twitter traffic to weather feeds. Optimising their use of both kinds of data, probably within hybrid cloud environments, will enable them to achieve deeper customer engagement and streamlined omnichannel commerce, gain insight into clients’ behaviour, and make client outreach more effective and cost-efficient.

According to an IDC study, the amount of data in the world is set to grow ten-fold in the next six years, to 44 zettabytes (44 trillion gigabytes). By leveraging next-generation technologies to harness this data, firms can gain insights into the behaviours and preferences of customers, as well as into how they interact with the world – uncovering hidden insights that lead to new revenue opportunities.
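As a quick sanity check on those units (using the decimal SI definitions, where a zettabyte is 10^21 bytes and a gigabyte is 10^9 bytes), the conversion works out as claimed:

```python
ZETTABYTE = 10**21  # bytes, SI decimal definition
GIGABYTE = 10**9    # bytes

total_bytes = 44 * ZETTABYTE
total_gigabytes = total_bytes // GIGABYTE

# 44 ZB = 44 * 10**12 GB, i.e. 44 trillion gigabytes
print(total_gigabytes)  # 44000000000000
```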

Pimping your gear to build a DIY Big Data Machine

There are various guides available that can take you through the steps required to build a big data analytics platform, such as this one on Hadoop. 

Among the main considerations is the core platform. Whether this is hosted on premises, in a remote data centre or across a hybrid environment, it needs to be tuned to provide the necessary raw power as well as scalability. In power terms, it needs to be able to ingest information at speed. You also need to know whether ingested data is immediately available for analysis, what level of latency to expect, and how that latency might affect real-time applications and MapReduce performance. Finally, you need to understand how the platform can be scaled to meet ongoing demand in terms of the number of nodes, tables, files, and so on.
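To make the MapReduce reference concrete, here is a minimal single-machine sketch of the two phases (a toy word count; on a real Hadoop cluster these steps run distributed across many nodes, which is where ingest speed and latency start to matter):

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reduce: sum the counts emitted for each distinct word
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data needs big memory", "memory upgrades drive big data"]
print(reduce_phase(map_phase(lines)))
# {'big': 3, 'data': 2, 'needs': 1, 'memory': 2, 'upgrades': 1, 'drive': 1}
```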

There are vendors that would suggest this means you need to go out and purchase racks of powerful new servers from them. However, the mega-scale vendors (such as Google and AWS) have shown that big data engines can be built using commoditised hardware, and as a further option you can also tune up your existing hardware to provide the horsepower to make the most of big data analytics.

Our recommended steps are:

  1. Assess your requirements – the Hadoop buyer’s guide linked above provides a good methodology for this.
  2. Look at how you can revamp your existing kit – companies like Kingston provide low-cost memory upgrade options that allow you to pimp your existing kit to provide the power, and potentially also the scalability, to drive your new big data platform.
  3. Build out as you need to, using commoditised hardware – just as the mega-scale cloud vendors do.
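Before pricing a memory upgrade (step 2), it helps to know what your existing servers actually have fitted. On a typical Linux box (assuming the standard `free` and `dmidecode` tools are installed), a quick audit looks like this:

```shell
# Show total, used and available memory in human-readable units
free -h

# List physical DIMM slots with their sizes, speeds and locations
# (needs root); empty slots report "No Module Installed"
sudo dmidecode --type memory | grep -E 'Size|Speed|Locator'
```

Empty slots and mismatched module speeds are the usual signs that a cheap upgrade will buy you real headroom.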

Hopefully this will allow you to make your budget stretch further and gain the promised value from big data insights for less than you might think.

If you want to discuss pimping your hardware with Kingston Tech, visit them at Cloud Expo on March 11th and 12th, at stand 512.
