
HackDig : Dig high-quality web security articles

Why Privacy Matters in Cybersecurity

2022-04-11 06:04

In this episode, Jarell Oshodi, Deputy Chief Privacy Officer for the Centers for Disease Control and Prevention, discusses the role of privacy in the cybersecurity field. As an attorney, she brings a unique perspective to the conversation on how we can work better with our privacy officers.

Spotify: https://open.spotify.com/show/5UDKiGLlzxhiGnd6FtvEnm
Stitcher: https://www.stitcher.com/podcast/the-tripwire-cybersecurity-podcast
RSS: https://tripwire.libsyn.com/rss
YouTube: https://www.youtube.com/playlist?list=PLgTfY3TXF9YKE9pUKp57pGSTaapTLpvC3

We often hear that the field of cybersecurity is extremely broad, encompassing skills that range from the highly technical to administrative and advisory roles. The profession continues to evolve and grow. With all of the new regulations that have emerged in recent years, privacy has also become part of cybersecurity's purview, broadening the traditional roles that make up the profession even further. However, privacy is larger than security.

I recently had the opportunity to speak with Jarell Oshodi, who is the Deputy Chief Privacy Officer for the Centers for Disease Control and Prevention. Jarell is not the typical cybersecurity professional. She is an attorney, a background that brings greater insight to privacy protection, and to cybersecurity as well. Her perspective is important for all cybersecurity practitioners.

Tim Erlin: Thanks for speaking with me, Jarell. I am glad that you are here. I generally speak with people about information security, which is a pretty broad topic, and that topic can include privacy. But I think there is really a distinction between privacy and security. When you look at the details, the two disciplines don't always align with each other. I want to start with understanding what the difference is. How do you see privacy and security as different? How are they distinct?

JO: The security professionals that I work with are generally concerned with what they call the CIA triad: the confidentiality, integrity, and availability of data. They want to make sure it doesn't get into the hands of bad actors, that it doesn't get tampered with, and that it's available when we need it. Privacy professionals, however, tend to focus on the rights that individuals have to control their personally identifiable information, and how it's used. So basically, a security professional is protecting against malicious threats, while on the privacy side, we focus more on the life cycle of data: how personal information is collected, shared, used, retained, and destroyed. Not just any information, but information that includes personal identifiers. Keeping personal data away from cybercriminals doesn't automatically make an organization compliant with data privacy regulations, if that makes sense.

TE: Yeah, that makes perfect sense. That always makes me think back to the payment card industry and the PCI Data Security Standard. When you dig into it, you have to remember that it's actually there to protect the card brands, not the organization. So understanding the motivation behind the controls and protections in place, even if the controls themselves are the same, is really important. It changes what the objective is.

JO: Most definitely. And, in privacy, especially with regards to privacy impact assessments, or what some people may call data protection impact assessments, we are definitely looking at the controls. We're looking at technical, administrative, and physical controls. We collaborate with security professionals to find out what they think about these technical controls. This is especially true because I'm not as technically versed as my counterparts, so it's definitely a collaboration.

TE:  Let’s talk about the role of a privacy officer, because it’s different from being a security analyst. What is your job as a privacy officer?

JO: It encompasses a myriad of things, all involving PII. For example, keeping our data inventory up to date, and implementing privacy by design. We want people to reach out to us, ask us questions, and seek guidance. When new products or new systems are being considered, we want privacy to be embedded into the entire process. We don't want to discover privacy gaps after the fact; correcting that costs more money than if privacy was considered throughout the process.

It is important that employees, third parties, and customers are notified and given the opportunity to consent, as well as the opportunity to withdraw consent. Developing privacy operations altogether, from trainings and tabletop exercises to contracting, is a big part of it as well. This is heightened especially when personal identifying information is involved. Even with a data use agreement, if, for example, we're sharing information for research, we still want to make sure that the party we are sharing information with has the proper controls in place, and that they're going to take as good care of this data as we would. We also want to make sure that they aren't going to share it with other people, and that risk assessments are done.

So, as I spoke about data protection, we want to make sure that when a new system is being developed, we're performing a privacy impact assessment. But also, when a particular authority decides to use the same system to collect a different type of PII, or more PII, we assess how that is going to be used. You may have a system where no Social Security numbers were involved, and now that system will be used to collect that information. That requires different controls. Or, in the case where the PII is already present in the system and is now going to be used for a different purpose, that requires what they call "fresh consent".
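The "fresh consent" rule described above can be sketched very roughly in code. This is an illustration only, not a CDC process: the subjects, purposes, and the `consents` store are all hypothetical, and a real system would track consent with far more nuance.

```python
# Hypothetical sketch: before any new use of a subject's PII, check whether
# that purpose is already covered by consent, or whether "fresh consent"
# must be obtained first. All names and purposes here are invented.
consents = {
    "alice": {"public-health-research"},                    # purposes already consented to
    "bob": {"public-health-research", "contact-tracing"},
}

def needs_fresh_consent(subject: str, purpose: str) -> bool:
    """Return True if this use of the subject's PII requires new consent."""
    return purpose not in consents.get(subject, set())

print(needs_fresh_consent("alice", "public-health-research"))  # False: already consented
print(needs_fresh_consent("alice", "contact-tracing"))         # True: a new purpose
```

The key point the sketch captures is that consent attaches to a purpose, not to the data itself: the same record can be fine to use for one purpose and off-limits for another.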

Also, collaborating with the security function when it comes to data incident response is very important. For instance, our security department encounters different types of data in incidents and breaches. We may not be needed if an incident doesn't involve personal identifying information, but if it does, we are immediately notified, and we have to mitigate those risks and determine how notice should be given to those affected.

More specific to the government sector, we have something known as system of records notices, and those are published in the Federal Register. A system of records is basically just a system where information is retrieved by using a personal identifier, such as a Social Security number. So, if it's considered a system of records, then we have to provide notice to the public, and then allow a 30-day comment period before the system is allowed to operate.

 We work with all the business units when privacy is involved.  We are definitely not siloed at all.

TE:  You have to be anywhere there’s PII.

JO:  Exactly. Marketing, Finance, Human Resources, Research and Development.  You name it, PII is likely involved.  We would love to minimize the use of data. That’s our main goal.

TE:  Yeah. Shrink the environment as much as possible. So, is it highly valuable for you to have cross-functional knowledge of how those different functions operate in order for you to do your job more effectively? It seems like it would be.

JO: It's important to have cross-functional knowledge, but it's more important to have cross-functional relationships, because I don't know everything. I'm not interested in knowing everything, but I am interested in making sure all of these different business units know that my door is open. The lines of communication are always open. By reaching out to me, they are making my life much easier, and I can also create privacy champions in these different areas. The more privacy champions you have, the more trainings you can do, and the less risk there is of human error and incidents. Reciprocally, as we work together more, we can create FAQs, flow charts, and other resources, and they are empowered by the knowledge they have because of our relationship.

TE:   That’s important. The relationship aspect is one that I hadn’t really thought too much about. When we talk about cybersecurity, most security analyst roles come from a technical background. It’s very common for people to start from an IT role, and they move into a security role from there. Your background is in law as opposed to technology.  Is that right?

JO: Yes. I am not technical at all. I rely heavily on our technical privacy analysts and our compliance privacy analysts. It's a team effort. I know the law, how to apply the law, and how to operationalize the law. But I also know how to be resourceful and leverage those who do have the technical backgrounds to translate what I would like to implement. They are the translators between the technical and the compliance sides.

TE:  What do you wish information security professionals knew about your job as a privacy officer?

JO: I wish that they knew that the job does not just involve data incident response. Outside of data incident response, I am working with all business units across the organization, giving privacy guidance, and interpreting new privacy laws that are constantly changing. I mean, these privacy laws are literally constantly changing. I wish they would understand that privacy is a much bigger picture.

TE: Do you wish those security practitioners saw you and other people in that privacy officer role more as a resource to go to for that kind of information and for help and understanding what the privacy implications are?

JO: Yes, most definitely, because we have to collaborate in order to do our job well. We need security, and our collaboration with them is key. We love our security professionals, but our world is just much bigger. I sometimes feel as though some security professionals don't realize all of the things that we're tasked with, all of the missions we're on, all of the best practices that we are trying to implement, and all of our success measurements as well.

TE: How has your career informed your view of privacy and privacy rights? For example, how do you feel that view of privacy and privacy rights has changed from when you started your career in privacy, to today?

JO:  I understand the balance of needing to get something done for the greater good, while also protecting an individual’s rights.  We have technical, administrative, and physical controls in place, but receiving fresh consent when a person’s information is used for an additional or new purpose is part of the balance.

TE: Do you think that people understand what consent means when they give it, such as when they click a box on a paper-based or online form?

JO: No. This is why I'm glad that there is a push, especially through laws like GDPR and the California Consumer Privacy Act, for plain language. I love that they are trying to get rid of legalese and technical terms. It's important that we actually understand what people are doing with our data, and how they are profiting from it.

TE: For someone who’s interested in privacy as a career, is that law background a requirement?

JO: It's definitely not required. In the privacy realm there are generally attorneys, people with legal backgrounds, or legal-adjacent positions. Privacy compliance is where a legal degree helps. But there's also a new, growing field, and there's actually a new certification for it: the privacy engineer. They are the tech people who are the translators. They're the best of both worlds. We have the security folks and the people who are more technical, but they may not be able to communicate as easily with someone like me, who may speak legalese. The privacy engineer is the perfect middle person to help us get the job done.

TE: Yeah. That’s really interesting. You obviously work at a large government agency, but privacy isn’t something that’s exclusive to government.  What are the key differences between privacy considerations for a government agency, versus a commercial organization?

JO: In the government, the Privacy Act of 1974 is the main law that we follow, and operational law ties in under it. Alongside the Privacy Act, there's the E-Government Act of 2002, which states that for every system, a privacy impact assessment is required, as are those system of records notices I spoke of. People can request information that a government agency has about them in a system of records through a Privacy Act request, and we are obligated to respond within a certain number of days.

In the private sector, there are all of these sectoral privacy laws, as well as state privacy laws, and other countries have their own privacy laws built on the same principles. For instance, GDPR has a long list of individual rights that a person has with respect to organizations. A person can request access, deletion, correction, and other information about the data that a company has about them, and the company is given a certain number of days in which those requests must be responded to. Unlike in the government, there are no system of records notices required, and there is no mandate that a privacy impact assessment must be completed for every system that a company has. With GDPR, a Data Protection Impact Assessment is only required when sensitive or high-risk PII is involved.

Those are the obvious differences. Also, with contracts, there are differences between the government and the private sector, mainly because there's no comprehensive federal privacy law, and not every state has a privacy law. Lots of the data protections that are in place exist as clauses in private contracts.

Even though the Privacy Act of 1974 predates much of the technology that we’re using today, it still provides the foundation for the privacy requirements and practices that we apply today in our much more connected world.

TE: How does that work actually?  That’s fascinating.

JO: The Privacy Act is based on the fair information practice principles, and those principles are what I personally feel all of these privacy laws are based upon. The government preceded the private sector by many years when it comes to the whole premise of data handling: the collection limitation principle, the data quality principle, the purpose specification principle (you need to state the purpose or reason that you need PII), the use limitation principle (you can't disclose it for other purposes), the security safeguards principle, the openness principle, and the individual participation principle, which covers the rights that individuals have with regard to their information. Finally, there's the accountability principle, which says that whoever is controlling my data has to be accountable by complying with whatever regulations or measures are in place.

Those fair information practice principles (FIPPs) are really what all privacy laws are based on. I find it interesting that 10 years ago, when I was handling privacy access requests for individuals, I always felt like I was speaking another language, because my friends in the private sector were unaware of how many rights we had with regard to our personal information. Now, many of them understand.

TE:  Yeah. It seems that, from an external perspective, GDPR was a watershed moment for changing the public perception of data rights.

JO: 100%. Most people didn't even know that data privacy existed in that sense. GDPR also caused companies and corporations to manage their data better. If the data is mapped properly, using a privacy by design approach, then when you need all the information on Jarell Oshodi, for example, you can see which systems it resides on, and you're able to carry out that request.

TE: Yeah. If you’re required to delete all the data on an individual, you better be able to find all the data.

JO: Exactly. And that's how it starts. That's why data inventory is the very beginning. You can't manage what you don't know you have. You also can't respond properly to data incidents and data breaches.
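The data-inventory idea can be sketched in a few lines. The system and field names below are invented for illustration; the point is that once the inventory is mapped, an access or deletion request can be resolved to the specific systems that actually hold a person's PII.

```python
# Hypothetical data inventory: which systems hold which categories of PII.
inventory = {
    "hr-payroll": {"name", "ssn", "bank_account"},
    "marketing-crm": {"name", "email"},
    "research-db": {"study_id"},  # de-identified, no direct PII
}

def systems_holding(pii_fields: set[str]) -> list[str]:
    """List the systems that must be searched to fulfill a request
    touching any of the given PII fields."""
    return sorted(sys for sys, fields in inventory.items() if fields & pii_fields)

print(systems_holding({"name"}))  # ['hr-payroll', 'marketing-crm']
```

Without a map like this, "delete everything you hold about me" becomes an organization-wide search; with it, the request is a lookup.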

TE: It's an interesting corollary to the security phrase that "you can't secure what you don't know you have." The same is true for data and privacy.

JO: Yeah, definitely.  They definitely overlap.

TE: You’ve been in this career in privacy for a long time. What’s the biggest change in privacy law that you’ve seen?

JO: The biggest change is the awareness of individual rights: more people are aware of their right to access data about themselves. GDPR, I believe, is the reason. In my field, under the government's Privacy Act, people have always had access to their personal information, but with GDPR, and with companies wanting to be compliant, even businesses that aren't based in Europe have set themselves up in a way that respects personal information rights.

TE: So GDPR didn’t necessarily change the rights that people in the US had in terms of data, but it increased their awareness.

JO:  Exactly.

TE: From the privacy side of the industry, what lessons do you think information security can learn from privacy?

JO: In privacy, we tend to focus on best practices. It would be nice if the security folks understood that our job isn't done just because we've mitigated a breach, or because we prevented access in a particular place. We're always focusing on best practices, minimizing data where we can, and constantly performing risk assessments when data is used in a certain way.

TE: You touched there on the idea of minimizing where data is used as a means to shrink the footprint of what you have to be concerned about. That seems like something security people could apply to the concept of minimizing the attack surface, which would also reduce the amount of work needed to secure an environment.

JO: Yes. The less PII involved, the lower the risk.  If a system is breached and none of the information involves PII, then the risk tends to be a bit lower. For companies, it’s their reputation at risk. It’s the trust of their clients.  That’s a competitive differentiator these days. So, the more de-identified or anonymized information we can use, the better.
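One common way to act on "the more de-identified or anonymized information we can use, the better" is pseudonymization: replacing direct identifiers with keyed hashes so that analysts never see the raw PII, while records can still be linked. A minimal sketch, with a key-handling scheme that is illustrative only, not production guidance:

```python
import hashlib
import hmac

# Illustrative only: a real deployment would keep this key in a secrets vault
# and rotate it, not hard-code it.
SECRET_KEY = b"store-me-in-a-vault"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an SSN) with a keyed hash.

    The same input always maps to the same token, so records can still be
    joined for analysis, but the original value cannot be read back out
    without the key."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"ssn": "123-45-6789", "diagnosis": "flu"}
safe_record = {"subject_token": pseudonymize(record["ssn"]),
               "diagnosis": record["diagnosis"]}
```

Note that pseudonymized data is still personal data under GDPR; only truly anonymized data, where re-identification is no longer possible, falls outside its scope.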

TE:  Excellent. I want to thank you, Jarell, for speaking with me. I learned a lot about privacy, and what it means to be a privacy officer, and I really appreciate it.

JO:  Thank you. I appreciate you asking me, and I appreciate discussing it as well, because it lets more people become familiar with what I, and other privacy specialists do.


Source: www.tripwire.com/state-of-security/podcast/why-privacy-matters-in-cybersecurity/

