“Big Data” – In recent times, these two words are enough to get people excited. However, the word “Big” doesn’t mean that big data is limited to bigger companies or to organizations serving bigger audiences. Big data means gathering large sets of data, processing and analyzing them according to business needs, and then systematically putting the results to use to maximize business opportunities and subsequent profits.
Statistics show that revenue generated from big data has skyrocketed. In 2015, it accounted for US$122 billion; it’s expected to generate US$189.1 billion by the end of 2019. Hold on, we’re not done yet – these numbers are expected to reach a whopping US$274.3 billion by 2022. Now that’s an enormous number.
Big data is helping businesses across domains like healthcare, banking, media, retail, energy, and utilities grow at an unprecedented rate. But amid the excitement of having such power in our hands, we often overlook the pitfalls that come with it.
Companies working on big data handle vast chunks of user data and personalized information to analyze certain trends. This introduces the pitfalls we were talking about: data breaches, cybersecurity compromises, and privacy lapses are the biggest challenges to maintaining a secure big data environment. Let’s look at these challenges by assessing the security implications in the various phases of the data science lifecycle.
Security Concerns in Data Science Lifecycle
The data science lifecycle doesn’t follow a set guideline as such. While this “wheel” generally moves through its phases in a set order, it’s possible to move in either direction (forward or backward) at any stage in the cycle. Work can take place simultaneously in several phases, or you can skip over an entire phase if required. In addition, if new information is discovered, work may return to an earlier phase, or even the very first one.
A simplified data science lifecycle typically encompasses five phases, each of which will have unique security implications, while also being supported by a foundation of strong organizational security policies and practices. Let’s have a look at them.
Have you put the appropriate legal, privacy, and compliance controls in place for employees who are authorized to acquire data?
Ex. Data Engineers have received privacy and data security training.
Have you validated the data, labeled it effectively, and documented it as part of your data inventory?
Ex. The Privacy Impact Assessment (PIA) has been updated to align with the organization’s EU General Data Protection Regulation (GDPR) compliant processes, which may cover a targeted subset of your overall data inventory.
Do authorized employees know how to recognize an outlier and how to engage the necessary process to adhere to data handling practices?
Ex. A free-form field or unstructured data that should be encrypted is surfacing sensitive customer details in clear text, and your analyst knows how, and to whom, to communicate this.
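To make the example above concrete, here is a minimal sketch of an automated scan for sensitive data surfacing in a free-form field. The patterns (email addresses and US-style Social Security numbers) and the function name are illustrative assumptions, not part of any specific compliance tool:

```python
import re

# Hypothetical patterns for data that should never appear in clear text.
# Real deployments would use a vetted DLP rule set, not two regexes.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # US Social Security number
}

def scan_free_text(text):
    """Return the kinds of sensitive data found in an unstructured field."""
    return sorted(kind for kind, pat in SENSITIVE_PATTERNS.items()
                  if pat.search(text))

# A free-form "notes" field leaking customer details in clear text:
findings = scan_free_text("Call jane.doe@example.com, SSN 123-45-6789")
```

A scan like this can run on ingestion so the analyst is alerted, and knows whom to notify, before the record ever lands in an unencrypted store.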
Action / Insights
Are access and authorization controls adequate to manage the distribution of a report?
Ex. Distribution of artifacts is through ChatOps, email, or shared links that are updated through automated processes, and these processes are reviewed every quarter.
Have you deployed automated capabilities that analyze actual data passing through collaboration tools, applications, APIs, and services to detect data flows that had not previously been authorized?
Ex. Third-party plug-ins have been enabled through click-through software agreements by the marketing team in the organization’s Customer Relationship Management (CRM) software, but the plug-ins are granted read access to all data.
Security Controls for Big Data Security
As seen above, each phase of the lifecycle has security loopholes that need to be monitored, assessed, and plugged correctly to prevent breaches and loss of highly sensitive data. A number of security controls specifically support big data security; these should be reviewed from time to time, and potentially strengthened, as part of a big data security review. Here are a few of them:
Access Control and Authorization
Organizations with valuable data need to monitor who has access to what data, and where. Implementing role-based access control, using strong authentication methods, and maintaining robust, auditable access control policies and procedures are a critical part of mitigating insider threats.
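Role-based access control can be sketched as a simple mapping from roles to permissions, with deny-by-default for anything unlisted. The role and permission names below are illustrative assumptions, not any specific product’s API:

```python
# Minimal role-based access control (RBAC) sketch. Role and permission
# names are hypothetical examples for a data team.
ROLE_PERMISSIONS = {
    "data_engineer": {"read_raw", "write_pipeline"},
    "analyst": {"read_curated", "create_report"},
    "admin": {"read_raw", "read_curated", "write_pipeline",
              "create_report", "manage_access"},
}

def is_authorized(role, permission):
    """Check a single permission; unknown roles are denied by default."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default lookup matters: an analyst can build reports from curated data but cannot touch raw records, which directly limits what a compromised or malicious insider account can reach.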
Reducing the potential for unauthorized data usage by trusted insiders is particularly crucial for any company working with big data, given the high value of the information individuals are authorized to work with. Standard due diligence during the hiring process is one mitigating factor. However, the assurance process should continue even after onboarding, as both employees’ personal circumstances and their access levels and roles can change.
Compromised endpoints are often the primary vehicle for data leaks. Sensitive data can be exposed through device loss or theft, as well as users intentionally or accidentally not following security policies, for instance, accessing information via unsecured wireless networks. As such, extending data protection to the endpoint is critical. This is often accomplished by creating personas for the data team coupled with authentication profiles and insider threat monitoring.
Maintaining Data Hygiene
As data science moves into a more strategic business function, data scientists and business managers are asking for more and more access to the data and the systems that support its collection, analysis and reporting. Security teams seeking to both enable the business and protect its data need to apply appropriate data hygiene practices to empower the data science team without compromising security or the integrity of the data they use.
Data should be encrypted, both at rest and in motion, and the encryption must extend to endpoint devices. Security teams should follow encryption best practices for sensitive information and be mindful of ensuring proper key management as a poorly performing key management system will compromise even the most robust encryption algorithms.
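One key-management practice implied above is to avoid encrypting everything under a single key: derive a distinct data key per dataset from a master key, so individual keys are scoped and rotating the master invalidates all of them at once. The sketch below uses an HMAC-based derivation as a simplified stand-in for a proper KDF behind a KMS; the names and master-key handling are illustrative assumptions:

```python
import hashlib
import hmac

# In practice the master key lives in a KMS/HSM, never in source code.
MASTER_KEY = b"replace-with-kms-managed-master-key"

def derive_data_key(dataset_id: str) -> bytes:
    """Derive a per-dataset 256-bit key; rotating MASTER_KEY changes every
    derived key, and no dataset key reveals any other."""
    return hmac.new(MASTER_KEY, dataset_id.encode(), hashlib.sha256).digest()

# Each dataset gets its own key; compromising one does not expose others.
pii_key = derive_data_key("customer_pii")
clicks_key = derive_data_key("clickstream")
```

Pairing derived per-dataset keys with a strong cipher keeps the blast radius of any single key compromise small, which is the point of the key-management caveat above.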
Use Penetration Testing to stay one step ahead of hackers
Infrastructure penetration testing provides critical insights into your business databases and associated processes and helps keep hackers at bay. Penetration testing is a simulated cyberattack against your computer systems and network to check for exploitable vulnerabilities. It is like a mock-drill exercise to test the capabilities of your existing networks and processes. Penetration testing has become an essential step in protecting IT infrastructure and business data.
Penetration Testing as a Solution for Big Data Security
Penetration testing involves six stages:
- Preparation: The scope of the pen test is defined in this stage, and all parties involved in the engagement are prepared accordingly.
- Kick-Off: The kick-off call is generally a brief 30-minute call between the customer and the pen testing team. It confirms that everyone involved understands their role and is ready to begin the pen test.
- Testing: The first two stages establish a clear scope definition; this is where the action happens. Here, experts probe for vulnerabilities and try to exploit the security flaws they find.
- Reporting: Once the pen testers have done their job, it’s time to formally compile all the findings and report them to the customer’s system administrator or product manager. This should be an interactive, ongoing process, with updates accompanied by recommendations for fixes.
- Re-Testing: Once the customer is aware of the security issues identified during the pen test, addressing each issue happens over the following weeks and months. A re-test should then be carried out to measure the effectiveness of the fixes. If any issues persist, go back to the reporting stage and repeat these two stages until the vulnerability is completely fixed.
- Closure: Marking the issue(s) as closed completes the sixth and final stage of penetration testing.
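The report → re-test loop above can be sketched as a tiny status workflow for a single finding. The status names and transitions are illustrative assumptions, not Cobalt’s actual workflow states:

```python
# Each finding moves reported -> fix_deployed -> retesting, and from
# retesting either to closed (re-test passes) or back to reported
# (issue persists), mirroring the looped stages described above.
VALID_TRANSITIONS = {
    "reported": {"fix_deployed"},
    "fix_deployed": {"retesting"},
    "retesting": {"closed", "reported"},
}

def advance(status, next_status):
    """Move a finding along the workflow; reject invalid jumps
    (e.g. closing an issue that was never re-tested)."""
    if next_status not in VALID_TRANSITIONS.get(status, set()):
        raise ValueError(f"cannot move from {status} to {next_status}")
    return next_status
```

Encoding the loop this way makes the key rule explicit: a finding cannot be marked closed straight from the report; it must survive a re-test first.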
Sometimes vulnerabilities sit in plain sight of system and network administrators and yet go unnoticed amid the sheer volume of big data. Thus, your company needs specialized services to fill the cybersecurity holes that exist.
Cobalt is one such solution provider that offers penetration testing as a service for modern SaaS businesses.
Cobalt offers a modern application security platform that provides a find-to-fix workflow for all penetration testing and vulnerability assessments carried out across your organization. The ease with which it delivers a clear, detailed, and actionable report of its findings through the Cobalt Central app is quite impressive. It also displays vital data in a visual format to convey different types of critical vulnerabilities. Not just that, Cobalt specializes in providing constant support and evaluation of reported issues until they are fixed by your system administrators or other responsible IT personnel.
Businesses are evolving and proliferating by using big data as a tool for achieving their goals and profitability. With the advent of artificial intelligence (AI) and machine learning (ML), there seems to be no stopping this giant called “Big Data.” You also need to consider big data’s security implications. Frequently monitoring and improving processes will bring both monetary and cybersecurity benefits to your company. Periodic penetration testing can help ensure that your big data program is working efficiently and optimally. Security platforms like Cobalt, which offer robust and dynamic penetration testing and vulnerability assessment services, can fill any remaining gaps and bulletproof your business against cyberthreats.
See the Full Guide for more information on Big Data Lifecycle.