Welcome to Cash in the Cyber Sheets. I'm your host, James Bowers, and together we'll work with business leaders and industry experts to dive into the misunderstood business of cybersecurity and compliance to learn how to start making money from being secure and compliant. Welcome to Cash in the Cyber Sheets.
Hey, everyone, welcome to Cash in the Cyber Sheets. I am your host, James Bowers, Chief Security and Compliance Architect here at Input Output. Very excited to have you on again.
So today we have a few things to go over. First, I really just want to give some updates about what's going on with Input Output, what's going on with Cash in the Cyber Sheets, and some of the different changes that we're making, all the things going on there. So we'll definitely jump into that. I also want to go into some new legislation that's coming out relating to end-to-end encryption (E2E), and the long story short of it, because we are going to get into it:
There's looking to be more regulation about how that's managed and how that's controlled, and the intent is to help reduce child trafficking and child exploitation. I'll also throw this out there: while we're definitely not going to get explicit on that type of topic, I do want to give a quick warning. Any discussion of child trafficking, child exploitation, things like that can upset you, which it definitely can, and we're going to be talking about it a little bit today, but really more from a security side, more from a what-can-be-done side. I'm also going to give a lot of resources for how you can talk to your families about it and how you can help protect your own kids, something that we really focus on quite a bit.
I'm very passionate about that; I have kids of my own. So like I said, we'll jump into that. Before we get over there, and I don't know, maybe it's a moot point because we're not on Apple Podcasts or Google Play yet,
so asking you to click the like button is a little difficult to do, but eventually we are going to be on there. So if you're on Apple Podcasts or Google Play and you're listening to this episode or an older one on there, once we get it up and running, please click that like button, subscribe, tell your friends about us, talk about us at dinner, talk about us during Thanksgiving.
It really helps us build the show. Also, in the comments section, let us know some things that you would like to hear about, things that you would like to talk about, discuss, dive into. I'm very happy to have back and forth and get different opinions on things, because with the risk management side and with cybersecurity, it's really a seesaw of risk and reward, cost and benefit, and you can make arguments on both sides. So I'm very happy, very excited to have any of your comments. Please put those in there.
Please subscribe wherever you're at, and let's go ahead and jump into it. So first things first: we're actually starting, not starting, we started our new website. We're completely redesigning the thing. Basically I've been looking at how we can better market internally, and I am not a marketer, that is not my thing, but I'm not going to give myself too hard of a time about that, because I don't know, that's detrimental. Like The Secret says, you want to put those "I am" statements out there.
You don't want to keep telling yourself that you're not something, so I guess I am a person that is constantly improving their marketing skills. I'll go with that. Currently I'm not an expert, but maybe one day I will be. So I dove into, where's that book at? I'm not going to get it out now. It's Building a StoryBrand by Donald Miller, along with his Marketing Made Simple. Really great books, really easy reads, and looking at those, along with some of the people we work with, really opened my eyes to the fact that we just weren't doing a very good job with the messaging for Input Output, and also tying it into Cash in the Cyber Sheets.
So we kind of went on this whole program of rewriting everything and really trying to figure out how we can easily explain what Input Output does. That resulted in taking the entire website that we already had, that we paid quite a bit of money for, and we're very happy with the team that put it together, but taking that whole website, basically chucking it, and starting to build a whole new one that's very much to the point, very simple, not flashy, but just: here's what we do.
Here's how we do it and here's how you can pay for us to do it for you. Here's how you can work with us. So we just got that up and running yesterday.
Just a very few pages: the home page and a very simple solutions page. But as the days and weeks go by, we're going to continue to add pages to it and build in the integrations and the capability to purchase right through there, to get started with our programs right through there. And that new platform that we're building everything on is actually Kajabi. We're going to be able to do all of our instructional videos right on there, and all of our other courses, and be able to pull a lot of content in there.
So that's really exciting, to be able to pull that all into kind of a one-stop shop, which should also make it easier for us delivering everything and for our clients accessing everything. It's all hosted on Kajabi,
and I'm not a paid marketer for them. It also supports managing the podcast. We were toying with some other platforms before; we're definitely going to keep posting everything on YouTube, mostly because I like the logo.
I actually think the logo and the intro, I think we did a pretty good job with those. But we're going to start managing the podcast through Kajabi, which should make it easier for us to manage the RSS feeds, and like I brought up before, that should also allow us to get onto Apple Podcasts, Google Play, and all those other places, so you can actually click those likes and just access it wherever it works for you.
So those are some of the big things going on behind the scenes. We're also moving full steam ahead with the compliance program, which people are starting to jump into, which is really nice. I put a lot of time into it, and people are finding value in it, so that makes me feel good, and we'll keep getting all of that updated.
If you're listening to us on YouTube, probably not too much is really going to change there. But if you're not able to listen to us on other platforms yet, you'll be able to soon, and we'll also be getting the Cash in the Cyber Sheets pages and site fully up and running too. So that whole social marketing and social involvement side, all of that is also getting built out.
So, really excited about all that. Now, actually getting into our topics today: I wanted to talk about the new legislation that's being proposed, the STOP CSAM Act.
That's CSAM, child sexual abuse material. Basically this is being proposed in the US Senate to, I don't want to use the word improve,
I just want to stay neutral in how I'm explaining what this is. It's introducing new requirements, and more restrictive requirements, related to how CSAM is managed and how other child trafficking types of things are addressed and managed. And I guess at its core, it's really looking to expand on,
what is it, you would think I'd have it right here: Section 230. As a quick background to Section 230, in a nutshell it gives immunity, a safe harbor, to social platforms and other sharing platforms, so that they're not responsible for what's posted on their platforms.
Basically, if it's a forum site or something else like that and somebody posts something inappropriate or illegal, you can't go back to that provider, like say Facebook or any other platform. You can't go after them legally. They're not responsible for it, which I can understand; that definitely makes sense.
There's way too much for them to be able to monitor, and just at the core of it, if it turned into a thing where they had to monitor all of that, or they had to manage it, then it just turns into an opinion platform for that social platform, because they would be regulating everything that went on there. So I can definitely understand why it's there, and I can definitely respect some of the reasons why it's there.
I can also fully understand some of the arguments against it, in that a lot of inappropriate material, things that can exploit kids, is left up for too long or not addressed at all on some social platforms. In those cases Section 230 is being used basically as a shield: we don't have to do anything because we're not responsible. And right now, legally, that's true. From a personal point of view, I think there's what you have to do and what you should do. I'm definitely not privy to all of those interactions, all of those discussions and back and forth, but there have been some notable cases where child exploitation information has been left on social platforms for a very extended amount of time.
I don't know. I just feel like the social platform could probably do more in those cases, especially in the ones that are getting brought to light or getting a lot of attention. I would also think it'd be in their best interest.
So that's kind of what the STOP CSAM Act's intention is. Now, the argument against it, and this is where it turns into that seesaw of cost and benefit, reward and negative impact, against ROI on the security side, is that the STOP CSAM Act would also introduce impacts to end-to-end encryption (E2E). The language in there, and the arguments against it, say that if this goes through it's going to require that social platforms and law enforcement be able to look at any type of communication, including ones that are encrypted, to see if there's inappropriate content. On the side of protecting kids, that's a good thing. On the side of completely eroding encryption capability, security, and intent, that's not so much of a good thing. And I guess that brings up a few of the different topics here.
We'll talk about it. But with how some of the social platforms are, and how some of the cases have gone where they haven't really acted even when they've gotten a lot of pressure, I don't know if another law would help.
There are already laws against that; there are already requirements that social platforms have when they become aware of information on there. So even if we're saying that they don't necessarily need to track it all down and police it, maybe that would be too difficult, at least when they're made aware of it they need to act. Section 230 and other laws already have that in place.
So that seems like more of a "maybe we should just enforce that" rather than creating another law to say we really, really mean it. I don't know how much utility that's going to add, with the exception of potentially adding in a lot more roadblocks for businesses, introducing other really negative aspects that could be used against businesses, or even unintentionally just really gumming up the works. Also, at the end of the day, like we've talked about, it would probably be easy for a lot of bad actors to sidestep these things. There are a lot of ways to sidestep all the laws.
So when we put things in place like this, and I'm definitely not ever arguing against protecting people, protecting kids, we really want to make sure: is this going to have the impact, is it going to execute the intent that we want, or is this just going to gum everything up and make things more expensive for everybody else that wasn't already breaking the law, while not really impacting those that are? They'll just find a way around it. So I guess going into some of the arguments for the act: the biggest is that we want to stop child trafficking, and there are already very specific laws around CSAM, child sexual abuse material; a lot of that is already in place. And definitely, on the record, to give Facebook credit, because Facebook and Meta are called out quite a bit in some of these articles: of the 34 million pieces of exploitation content that was found on Facebook and Instagram in the final three months of 2022, 34 million in just three months,
that's terrifying, but of that 34 million, 98% was detected by Meta itself. So definitely kudos to them. They have platforms and systems in place to identify mostly pictures, and I guess videos, and to take those out, scrub them out, take action.
So that's a very good thing. Where the argument is, and what this article from The Guardian is discussing, is that there's a legal inconsistency: child sexual abuse material and images must be reported, but technically, reporting child sex trafficking isn't legally required. That's obviously a major problem, and it's concerning, not just with Meta but across the board, how little trafficking is being reported. I think this falls into "companies are prioritizing what's legally required."
That's actually the statement from the article. And that's really where the big concern is. So definitely the images are very, very bad, but it's not necessarily the image itself.
It's the impact on the kids, on the ones being exploited, and that typically goes into a next step of the exploitation. And if that's not being policed, not even policed, if we're not taking action when we identify those things, that's a big problem. So that's what the STOP CSAM Act is all about: to help manage the trafficking and that reporting inconsistency.
Now, let's see. There are a few other things, and I want to make sure that with some of this I'm not misquoting. But according to the National Center for Missing & Exploited Children (NCMEC), between 2009 and 2019 Meta reported just three cases of suspected child sex trafficking.
That was identified through actual subpoenas, and NCMEC confirmed that yes, that is true. Now, they do want to point out, and the article talks about this too, as does NCMEC, that there are a lot of other incident types being reported by Meta and other platforms.
So that number could be skewed: perhaps in only three instances that's the incident type that was selected, and for all the other ones some other incident type was used. But it just seems wild that, with 34 million pieces of content taken down in the last three months of 2022, October, November, and December, there were only three cases identified in a 10-year period between 2009 and 2019.
That seems wild just statistically. Even by accident, if we were told internally, hey, this is the incident type that you use, I feel like we should be clicking the child trafficking incident type by accident more than three times. So I think there's definitely a legitimate claim that companies could do better,
which is forcing this, or pushing this legislation, forward. What's also weird is that there have been several investigations finding that some of the social platforms, Meta included, haven't taken enough action, and that actually led to some of the investment funds that own Meta stock bringing suit against them, saying that they're failing to act on systematic evidence. Now,
taking that and reading between the lines, I mean, you don't really need to read between the lines there. It is the line: they brought suit.
They saw a problem. They're actually doing something. But reading between the lines more on that, there's got to be something going on, or the investors have to see something, because if I'm an investor, my inclination isn't to do things to damage my investment, and bringing a suit like this against a company
would definitely impact my investment. So I'll let you draw your own conclusions there. But what has to be going on for me as an investor to say, you know what, I'm going to go ahead and take a little bit of a bath on this, and we need to bring this up?
So right now I'm building a lot of the case, I think, for the STOP CSAM Act, and then we're going to tie into some of the arguments against it. Also, according to the article, and kudos to Meta here,
they actually have a lot of moderators that are looking at these things when they get reported. They go in and see: is it real, is it not, do we need to escalate? And according to some of those moderators, and from this article, a lot of times the cases that they would bring forward would just get closed out, and the response they would get was that it was done by an automatic bot: nothing's been done with this case in so long that it's being terminated, much like a support ticket. Which I think is a weird thing to have for this type of report; a ticket like that would probably be something we would want to keep in place until we've investigated it. And yeah, maybe we can say, listen,
there's such a backlog. All right, well, I mean eventually we should get to it.
I don't know if shutting them down is really the right thing. And apparently when that happens, it gives the moderators a negative hit on their accuracy scores, which is one of the metrics they were evaluated on and would hit their performance reviews. I can definitely see having an accuracy score for your moderators as part of their metrics; I don't know if having it work like this, or apply to this type of report,
is appropriate. I feel like that falls into the same realm as the fail-open principle: let's err on the side of safety.
If our power goes out, we want to make sure our fire doors can open, and that they can always open from the inside out. In cases like this, if you even suspect something, if you see something, say something, and we'll investigate it. And if it turns out to be nothing, thank God.
We're happy. We spent the money and looked into it.
There was no child being exploited, nothing bad was happening. That's a good thing.
Keep doing your job. Keep reporting. But in business, and in everything, you get what you incentivize. So if we're incentivizing, because it's their performance review,
if we're incentivizing accuracy, and part of that accuracy is cases that are acted upon, and a lot of them are being closed just because they're timing out, how many cases, how many things, are not getting brought up? How many things are not getting pushed forward that actually were something, just because I don't want to take a hit to my performance score, because I've got a mortgage to pay and I've got kids to feed?
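Just to make that fail-safe idea concrete, here's a minimal sketch in Python. It's purely hypothetical, not how Meta's or anyone else's moderation tooling actually works, and the statuses, categories, and timeouts are made-up names for illustration. The point is simply that a stale child-safety report escalates instead of auto-closing, and the timeout is never charged against the reporting moderator's accuracy score:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical statuses and categories; names are illustrative only.
OPEN, ESCALATED, CLOSED = "open", "escalated", "closed"
CHILD_SAFETY = {"csam", "trafficking"}

@dataclass
class AbuseReport:
    report_id: str
    category: str          # e.g. "csam", "trafficking", "spam"
    opened_at: datetime
    status: str = OPEN

def age_out(report: AbuseReport, now: datetime, max_age: timedelta) -> AbuseReport:
    """Fail-safe handling of stale reports: child-safety items escalate for
    mandatory human review instead of timing out like a support ticket, and
    the timeout never counts against the moderator's accuracy metric."""
    if report.status != OPEN or now - report.opened_at < max_age:
        return report                      # not stale; leave it alone
    if report.category in CHILD_SAFETY:
        report.status = ESCALATED          # err on the side of safety
    else:
        report.status = CLOSED             # low-risk categories may still expire
    return report

# Usage: a months-old trafficking report gets escalated, never silently closed.
stale = AbuseReport("r-123", "trafficking", datetime(2023, 1, 1))
print(age_out(stale, datetime(2023, 4, 1), timedelta(days=30)).status)  # escalated
```

That's the "doors open from the inside" version of the policy: the default outcome of doing nothing is more scrutiny, not less.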
How many legitimate cases aren't actually being brought up? So it's definitely a big issue there, and with a lot of the blowback that they've gotten over it, they've utilized Section 230 to successfully argue against it, to basically have immunity in court. And again, with Section 230, I can understand that you can't be everywhere at every moment.
I obviously can't be responsible for what somebody throws up or what somebody says. But on the other side, once you're made aware, I feel like you've got, even if it's not a legal obligation, a moral obligation to do something.
And that's a weird situation as well, because if PII is exposed, and that's what we focus on protecting a lot, PII and PHI, personally identifiable information and protected health information, if that's exposed, under privacy laws you have to be able to show due care and due diligence: the due care that you put everything in place to protect it as much as possible, and the due diligence that you're actually following up on that and taking appropriate action. So it just feels like there would be, at an absolute minimum, a due diligence impact, that we didn't do everything we could do when we found out about it.
That would introduce some sort of liability. But according to this article, there's a lot of shielding in the arguments; there's a lot of shielding with Section 230. So, coming down here
a bit. Okay. So before we get into the article that's really against the STOP CSAM Act, we'll talk about those arguments.
I guess, you know, from my perspective again, we really want to make sure that we're taking action, that we're doing what we need to do. Now, I'm hesitant, obviously, based on some of the other talks that we've had, about eroding security capabilities, about providing more oversight, providing more compliance.
It just creates even more of a minefield for business. And on the encryption side of things, if we're saying that on encrypted platforms you have to be able to identify what's in there, well, to do that you need to store the encryption keys somewhere that others can access them. And if more people have the encryption keys, that's more vectors, more possibility that the key gets compromised, and now other people have access to those encrypted messages, which kind of completely defeats the purpose of the whole end-to-end encryption protecting the thing.
So it's at this weird point of, okay, well, if we have the encryption and we make it to where nobody else can see it, we've got, say, full zero trust: even the platform doesn't have the keys, just the clients.
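To make that concrete, here's a minimal sketch of what "only the clients have the keys" looks like, using an X25519 key exchange and AES-GCM from Python's cryptography package. This is illustrative only, not the actual protocol of any messaging platform (real messengers add ratcheting, authentication, and more). The point is just that the relay in the middle only ever sees ciphertext, so there is no key for the platform, or anyone compelling the platform, to hand over:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each client generates its own key pair; private keys never leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Only public keys are exchanged (the platform can relay these freely).
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared  # both sides derive the same secret

# Derive a symmetric message key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"e2e-demo").derive(alice_shared)

# Alice encrypts; the platform only ever stores and relays nonce + ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"meet at 6", None)

# Bob, holding the same derived key, can decrypt; the relay cannot.
print(AESGCM(key).decrypt(nonce, ciphertext, None))  # b'meet at 6'
```

Any scanning or key-escrow mandate changes that picture: either the platform keeps a copy of the key, or it inspects content before it's encrypted, and either way the end-to-end guarantee is gone.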
Well, that also introduces the situation where those encrypted channels could be used for inappropriate content or inappropriate actions, and that's definitely the argument for the STOP CSAM Act and the impacts to Section 230. But at the same time, practically everything can be exploited. Every good thing can be used in a bad way; even healthy food,
if you eat too much of it, or if you drink too much water, can be bad for your health. So I'm not sure where it falls. I think that's more of an opinion thing, how much things should be opened up as protection against, I guess, the liability or the potential for bad actions. But I feel like with the encryption side, once you poke that first hole into it, it's done.
It's no longer viable. There's no real security. There's no real expectation of privacy.
You know, I think that's a completely different topic of discussion, but that's the concern there on the security side. If we're making it to where we can police it, fundamentally we're destroying encryption capabilities.
Now, the new CSAM bill basically does four main things. It makes it a crime for providers to knowingly host or store CSAM, or knowingly promote or facilitate sexual exploitation of children. It creates new civil claims and a corresponding Section 230 carve-out to promote private lawsuits against internet companies for the promotion or facilitation of inappropriate material.
So basically it gives a private right of action. It also requires providers to remove, in addition to reporting and preserving, apparent CSAM material. And it creates a notice and takedown system overseen by a newly created Child Online Protection Board.
So don't worry, we're from the government, we're here to help: it creates a whole new board and adjudicative arm for the government.
So the first point: it makes it a crime to knowingly host or store. That's kind of already in the current laws now. And it makes it a crime to knowingly promote or facilitate.
Now, I haven't read all the language, but I can definitely understand the trepidation in that that's kind of vague. Does it mean that because I know my platform can be used for it, I'm now culpable? Or are we wanting to take the stance that once you know about it, then you have to act?
I definitely support the latter. If something comes up, please act on it. Please protect the kids.
And it creates a new civil claim, with the corresponding Section 230 carve-out, encouraging private lawsuits against companies. Now, I'm not a lawyer; we've got that posted in all of our policies everywhere.
We don't provide legal advice. But I feel like if there was negligence on a company's part, you can always sue for negligence, and that's, by the way,
some of the argument with PHI lawsuits: okay, there's no private right of action for HIPAA. If you violate HIPAA, I can't just sue you. But if my information gets out there, I can sue you for negligence, and I'm going to support my case for negligence with the fact that you're not following federal law.
You're not following HIPAA. So I'm not suing you directly for HIPAA. I'm suing you because you're negligent and your lack of HIPAA compliance is going to win me my case.
I think there's already some of that in place: if a platform doesn't do anything, how much liability do they have once they're made aware? And I think it's worth digging into the Section 230 lawsuits, because things like this have come up before. How much are they able to use it as a shield and just say, you know, we're not involved?
We just provide the platform. Yeah, we were told about it, but we just provide the platform; you know, we can't be held liable. So I'm not sure of the exact extent there, and the devil is in the details as always, especially with any type of contract, but that's the second part of it.
The third part: it requires providers to remove, in addition to reporting and preserving, quote-unquote apparent CSAM when they obtain actual knowledge of the content on their platform. So again, I think this would require a dissection of some of the Section 230 defenses that have been used.
But you already can't promote CSAM or maintain it; it's got to be taken down, and since it's a crime, you need to be able to support a review of it.
So there are already reporting and preserving requirements in place; I'm not sure what this is introducing differently. The fourth piece: it creates a notice and takedown system overseen by a newly created Child Online Protection Board,
which requires providers to remove or disable content upon request, even before there's an administrative or judicial determination that the content is inappropriate. Currently, if a platform gets a notice that there's inappropriate material, unless they decide otherwise, it has to go through legal channels for them to get an actual notice to take it down. And as we talked about before, apparently that's not always being done.
What this would do is, if the Child Online Protection Board, the COPB I guess, sends something, they would have to take it down and then investigate. So some of the arguments in this article against it are that it's like guilty until proven innocent; that's their take on it. I can definitely see the potential for misuse there.
At the same time, playing devil's advocate, I'll say that the utilization of a social platform is not an inherent right. So while they can take down my material,
is that really impacting my First Amendment rights? You know, that can definitely be argued both ways. So with the introduction of the new bill, that's what it's looking to do. The biggest argument here is, one, that it's going to break end-to-end encryption, and that basically any online platform now is going to have this pseudo encryption, encryption theater, security theater: if I'm able to gain access to the platform, well, I can gain access to any of that encrypted material, because the platform has the keys to it.
So that's the argument here: would this result in no more end-to-end encryption capability? Also, with the additional mandates, would this make it difficult for other platforms to even get started or to engage? Would it create such a high barrier to entry, with so many expensive systems I now need in place to monitor and review all of this, that I'm not even able to start an application, and it's just the big companies that can do it? And then there's also the consideration, and always the concern, that perhaps frivolous complaints will be brought against a company, and that will cause communication lines or other platforms to be taken offline while it's investigated.
God knows how long that takes. While it's investigated, they're shut down, and that could really impact business. So there are definitely a lot of arguments against it.
The bill threatens free speech by creating a new exception to Section 230; again, like I said, is it really free speech? We talked about the notice and takedown: providers have got to remove the content.
You know, I kind of agree with this one. A big argument in this article is: well, why don't we just enforce what's already there? There are already laws in place, so why don't we just encourage Congress to enforce those existing laws? Since 2008 providers have faced large fines if they fail to report CSAM.
So they've already got to report this stuff after receiving actual knowledge of its presence on their platforms. Yet, according to this article, they couldn't find any case where the federal government has ever enforced that provision. So it kind of feels like there are already some protections in place, and perhaps, at least according to this, maybe we start with enforcing that before we tear down the encryption.
I'm going to go ahead and wrap it up, because I feel like I'm starting to ramble now and we're definitely getting to the end of our time. But I'll put the links to all of these articles in the comments so you'll be able to check them out. Also, and I'll say this again before we close out, do check out childrescuecoalition.org. That is a phenomenal resource, with solutions for protecting kids online and helping to prevent and stop trafficking, and as a parent, they've got a lot of really good resources on there about how you can have discussions with your kids and how you can protect your kids.
We actually use a lot of those resources to give presentations and seminars. So again, childrescuecoalition.org, definitely check out what they're doing. Very, very good platform and definitely worthwhile.
As we wrap up here, you know, I think the takeaway is looking at how much security, how many protections and controls, we want to have in place, and how much that's going to erode what we can actually do. If we put this in place, it's definitely going to impact our encryption capabilities, and if we're completely honest, there are ways that I, as a bad actor, can sidestep even that. So is this really just going to negatively impact businesses and not really provide any benefit?
And that's the same lens we have to look through for any other type of security control we put in place: is this actually going to help, or is it going to produce more harm than good? I'm still doing a lot of research, so I'm not going to say either way where I'm at on it, but I would definitely recommend you take a look at it. I guess I'm leaning against destroying encryption capabilities, because I think that just takes us down a road where eventually
we just have all of these check-the-box compliance and security requirements, but they're not actually doing anything, because the encryption isn't really true encryption when everybody has the keys. And if we go that way, I think, let's just not even check the box and just leave it all open. You know, that's definitely not the way to go, but again, security and compliance and all these efforts need to have a legitimate purpose, and with anything that we put into our business, we've got to look at it through that lens as well.
So I'll go ahead and wrap up for today. I definitely want to thank you for listening to Cash in the Cyber Sheets. Please don't forget to hit those subscribe buttons, whenever they eventually show up, and leave us some reviews and comments.
If there are topics you'd like to talk about, please bring those up. I will have all of these articles that we talked about today and all this information put into the comments, so you're free to take a look at more of the information.
But again, thank you for listening to Cash in the Cyber Sheets. We'll see you next Thursday, same time, same place.
Thanks for listening.