
Security News for the Week Ending May 20, 2022

Flaw in uClibc Allows DNS Poisoning Attacks

A flaw in all versions of the popular C standard libraries uClibc and uClibc-ng can allow DNS poisoning attacks against target devices. These libraries are likely used in millions of Internet of Things devices that will never be patched and will always be vulnerable. This is where a Software Bill of Materials is kind of handy. Credit: ThreatPost
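
To make the SBOM point concrete, here is a minimal sketch (my own illustration, not from the ThreatPost article) that walks a CycloneDX-style JSON SBOM and flags any component named uClibc or uClibc-ng, since the report says all versions are affected. The SBOM file name is a placeholder.

```python
import json

# Minimal sketch: flag uClibc / uClibc-ng components in a CycloneDX-style
# SBOM. Per the report, all versions are affected, so we flag on name alone.
AFFECTED = {"uclibc", "uclibc-ng"}

def flag_affected(sbom_path):
    with open(sbom_path) as f:
        sbom = json.load(f)
    hits = [
        f'{c.get("name")} {c.get("version", "unknown")}'
        for c in sbom.get("components", [])
        if c.get("name", "").lower() in AFFECTED
    ]
    for hit in hits:
        print("Potentially vulnerable component:", hit)
    return hits

if __name__ == "__main__":
    flag_affected("device-sbom.json")  # placeholder SBOM for one device
```

Run something like that across the SBOMs for your device fleet and you at least know which products carry the library, even if the vendor never ships a patch.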

Cyberattack on Hawaii Undersea Cable Thwarted

Homeland Security thwarted an attempted hack of an undersea cable that connects Hawaii with other parts of the Pacific region. While Homeland is not releasing any details of the attempted attack, had the attack shut down traffic, it would have been really bad for the region. Just one cable, for example, the Hawaiki Transpacific Cable, runs for 15,000 km and has a capacity of 67 terabits per second. Credit: Star Advertiser

Will the Mickey Mouse Protection Law Go Up in Flames?

Full disclosure: I have never been a fan of this law, so if it goes away, it won’t bother me. As some Republicans try to hurt Disney (by trying to abolish the Reedy Creek special district, for example), Senator Hawley (R-MO) introduced legislation to roll back the insane copyright “terms” that companies have used to make money off characters created a century ago. The downside of Hawley’s move is that it will likely anger a lot of people who make money off that 120-year copyright term, and they might choose to make donations to the other team to get even. Given that Washington runs on “contributions” and those donors are likely going to make that fact known, I would say the odds of this passing are not great, but who knows. Credit: MSN

Feds Write Memo That Says They Pinky Promise Not to Charge Security Researchers Under CFAA

Sometimes I probably come across as cynical. That is because I am. While it is great that the DoJ finally wrote a memo saying that it will not charge security researchers for finding security holes, that memo carries only a little more weight of law than if I had written it. Nothing in it is binding on the DoJ. Still, I guess, it is better than nothing. Credit: The Daily Swig

Sanctions Have Some Effect on Russia’s Tech Sector

Since Russia can no longer buy AMD and Intel processors, they had to find an alternative. The solution seems to be the KaiXian KX6640MA. This is an Intel-compatible chip, but it is a bit slow. One CPU benchmark reported that the 4-core, 4-thread chip scored 1,566 points. By comparison, an Intel Core i3, the slowest of the current Intel family, scored 14,427. Not exactly a match, and for anything that is time critical, that is a problem. Guess how you would feel if someone replaced your computer with one that was about a tenth as fast. Credit: PC Magazine

Get Ready for NIST’s Software Supply Chain Security Guidance

As part of the Executive Order on Improving the Nation’s Cybersecurity (EO 14028), NIST is required to do several things. Among those are guides and standards for improving supply chain security, and NIST has already released a number of draft documents related to those tasks.

If you sell to the executive branch, these will become mandatory. In some cases the government can bypass the FAR process (although some FARs will be created) and simply implement the EO as directives to executive branch agencies to do this or do that.

The first thing NIST did was create a definition of what counts as critical software. You can see this document here. It provides both specific criteria for attributes of software that meet the definition and a list of software types (for example, endpoint security tools) that meet those criteria.

Earlier this month, NIST released preliminary guidelines for enhancing software supply chain security. This document, NIST Special Publication 800-161 Rev 1, was released in draft form for comment. A lightweight bedtime read of over 300 pages, it is open for comments until December 3rd. It provides a very rich cybersecurity supply chain risk management (C-SCRM) process, and it will only get better with comments.

After releasing this, NIST held a workshop to go over the guidance, which is due to be finalized by February 6, 2022.

NIST has also created a new document titled Secure Software Development Framework Version 1.1, also known as NIST Special Publication 800-218, which is available here. Unlike SP 800-161, this one is only 31 pages.
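
To give a flavor of the kind of practice this sort of guidance covers, here is one small sketch of my own (not text from the NIST documents): verifying that a third-party component pulled into a build matches a hash you pinned in advance. The component path and pinned hash are placeholders.

```python
import hashlib

# Illustrative sketch of one supply chain hygiene practice: verify an
# acquired component against a known-good SHA-256 hash before building
# with it. The component path and pinned hash below are placeholders.
PINNED_SHA256 = "replace-with-the-published-sha256-of-the-component"

def verify_component(path: str, expected_sha256: str) -> None:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    actual = digest.hexdigest()
    if actual != expected_sha256:
        raise RuntimeError(f"{path}: hash mismatch ({actual}); refusing to build")

if __name__ == "__main__":
    verify_component("third_party/widget-lib-2.4.1.tar.gz", PINNED_SHA256)
```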

Perhaps I don’t understand all of this, but here is my take.

If you develop software, you want it to be secure.

If you sell software to the government, you will be required to follow this NIST process.

If you don’t sell to the government, but your customers sell to the government, you may be required to follow this process anyway.

So, you basically have three choices:

  1. Do nothing and see what happens
  2. Create your own secure software development framework
  3. Leverage all the work that NIST has already done and will continue to do, follow their guidance, and improve your software’s security.

Which one do you think is the best strategy?

I thought so.

Have You Adjusted Your Penetration Testing Strategy for the Cloud?

Hackers are targeting the cloud. Why? To paraphrase Willie Sutton, because that is where the data is.

Historically, penetration testers gain access to network devices through the “perimeter defense” and then move around (the so-called east-west movement), trying to get access to data wherever it lives inside the network perimeter.

But in the cloud, there is much more to it. Not that the traditional method doesn’t work, but it is no longer the only method, and if you focus only on the traditional methods, you may miss gaps that hackers won’t.

Take, for example, the Uber breach that compromised data on 57 million users and 600,000 drivers.

Hackers didn’t break into Uber’s data center.

They didn’t even try to break in through the cloud front door to Uber’s AWS presence.

Instead, they stole the password to Uber’s GitHub account, and while rifling through Uber’s code, they found hardcoded AWS credentials that unlocked an S3 bucket (not exactly best practice, but very common).

From then on, it was game over – they owned the data.
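
One immediate takeaway for development teams is to scan your own repositories for hardcoded cloud credentials before an attacker does. Below is a rough sketch: the AWS access key ID format (AKIA/ASIA plus 16 characters) is the documented one, while the generic “secret = …” pattern is a loose illustration that will produce false positives; a dedicated secrets scanner will do a far better job.

```python
import re
from pathlib import Path

# Rough sketch: scan a source tree for strings that look like hardcoded
# credentials. The AWS access key ID pattern is the documented format;
# the generic secret/password pattern is deliberately loose and noisy.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\b(?:AKIA|ASIA)[0-9A-Z]{16}\b"),
    "possible_secret": re.compile(
        r"(?i)\b(?:secret|password|aws_secret_access_key)\b\s*[:=]\s*['\"][^'\"]{8,}['\"]"
    ),
}

def scan_tree(root: str) -> None:
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), 1):
            for label, pattern in PATTERNS.items():
                if pattern.search(line):
                    print(f"{path}:{lineno}: {label}")

if __name__ == "__main__":
    scan_tree(".")  # scan the current repository checkout
```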

As we saw in the Capital One breach, the problem was not bad code but rather a bad architecture. Certain resources were publicly exposed. Basically on purpose – or at least not well thought out.

Hackers probe these environments for weaknesses, and when they find them, they exploit them. Oftentimes they probe environments before those environments are even operational, and likely before monitoring is turned on.
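
If your workloads live in AWS, one simple defender-side version of that probing is to check whether your own S3 buckets block public access. The sketch below uses boto3 and assumes credentials and permissions are already configured; it is only a partial view of exposure, since bucket policies, object ACLs, and other services matter too.

```python
import boto3
from botocore.exceptions import ClientError

# Partial exposure check: report S3 buckets that do not have all four
# public-access-block settings enabled, or that have no such configuration.
# Assumes boto3 is installed and AWS credentials/permissions are in place.
def buckets_to_review():
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets().get("Buckets", []):
        name = bucket["Name"]
        try:
            cfg = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"
            ]
            if not all(cfg.values()):
                flagged.append(name)
        except ClientError:
            # No public-access-block configuration at all; review it.
            flagged.append(name)
    return flagged

if __name__ == "__main__":
    for name in buckets_to_review():
        print("Review public exposure of bucket:", name)
```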

Many times companies forklift move their systems from a protected corporate data center to the cloud, not understanding that this is a really bad idea.

Another part of the problem is lack of partitioning. When a hacker does compromise credentials, the access he or she gets may be far greater than just one system or one network.

To make things worse, many times the company’s development and test environments are also in the same cloud, protected by the same credentials, but poorly secured because, after all, it is just dev.

Part of the problem is poor secure software development practices, which might have been less risky inside a protected corporate data center.

Hackers have figured this out and are having a field day. They will continue to have more success until the pen testing improves.

If you need assistance with this, please contact us.

Credit: Dark Reading

Minimum Viable Secure Product (MVSP)

Vendor risk must be a core part of every company’s cybersecurity program, but it is hard.

Especially when the vendor is a tech company developing software that you use.

Minimum Viable Product, or MVP, is a term marketing folks have used for years to describe a version 1 product with the minimum set of features that a customer will be willing to use or buy.

Add another letter and you have another acronym to remember – MVSP – Minimum Viable Secure Product. This is YOU defining what you consider the MINIMUM set of security features that you require in order to buy or use a vendor’s product.

With a little work, this could become a standard.

In part, because this MVSP checklist is based on the checklists already used by two small companies named GOOGLE and DROPBOX.

Rather than having to create your own set of “standards”, one has already been created for you based on what Google and Dropbox require of their vendors.

And it is licensed under the Creative Commons 1.0 license (free for any use).

And it will be updated as needed.

Who should use it?

Proposal teams should use it in RFPs.

Anyone can use it for self-assessments.

And vendor management teams can use it as their standard vendor cybersecurity questionnaire.

What is in it?

It contains four major sections: business controls, application design controls, application implementation controls, and operational controls.

Section 1 contains eight controls, section 2 contains nine, section 3 contains four, and section 4 contains three.
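
As a rough sketch of how a vendor management team might track a vendor’s answers against a checklist like this, consider the snippet below. The section names mirror the MVSP’s four sections, but the individual control names are placeholders I made up for illustration, not the actual MVSP items.

```python
# Illustrative only: section names follow the MVSP's four sections, but the
# control names are made-up placeholders, not the real MVSP checklist items.
CHECKLIST = {
    "Business controls": [
        "publishes a vulnerability reporting contact",
        "undergoes annual external penetration testing",
    ],
    "Application design controls": [
        "supports single sign-on",
        "encrypts data in transit",
    ],
    "Application implementation controls": ["keeps dependencies patched"],
    "Operational controls": ["has a documented incident response process"],
}

def gaps(vendor_answers: dict) -> list:
    """Return every control the vendor did not answer 'yes' to."""
    return [
        f"{section}: {control}"
        for section, controls in CHECKLIST.items()
        for control in controls
        if not vendor_answers.get(control, False)
    ]

if __name__ == "__main__":
    answers = {"supports single sign-on": True, "encrypts data in transit": True}
    for gap in gaps(answers):
        print("Gap:", gap)
```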

Alternatively, you can create this yourself. I am sure that you will do a better job than Google and Dropbox.

In fairness, you can tweak it for your own needs.

Credit: Helpnet Security

The MVSP project

The MVSP questionnaire

Businesses Losing Customers due to Connected Products Security Concerns

59% of cybersecurity executives at large and medium organizations say that they have LOST business due to product security concerns for connected and embedded devices.


45% say that customers want detailed information about what is in their devices, but only 11% of companies have high confidence that they can provide it, even if they want to.

Only 27% of people interviewed said that their organizations conduct software composition analysis (figuring out what is in the software), and only 30% say that they can easily generate a software bill of materials (as required by the new executive order).

So what does it take to develop secure products? More resources (62%), more expertise (60%), and industry standards (46%). Only 21% said that they have a supply chain security policy.


On top of this, only half of the respondents said their organizations check the security of their products before shipping them.

The good news is that 74% of the organizations either have a Chief Product Security Officer or plan to hire one. In the next two years.

And, last but not least, only 10% have full confidence that they know all the vendors in the supply chain for each of their devices.

Ready to buy one of them secure connected devices now?

Credit: Help Net Security

Be Careful What Contracts You Sign

While the details of this are interesting, what is more important is thinking about all of the contracts that you sign.

This is a legal battle that goes back several years.

In one corner is Fiserv, the (roughly) Fortune 200 financial services software behemoth.

In the other corner is Bessemer System Federal Credit Union, a small community credit union in Pennsylvania.

In 2018 Brian Krebs reported bugs in Fiserv’s platform that allowed one customer to see another customer’s name, address, bank account number and phone number.

So Bessemer FCU did some more testing and found more bugs – security holes.

According to the credit union, Fiserv responded with an aggressive notice of claims, attempting to keep Bessemer from discussing these security bugs with third parties, including other Fiserv customers.

In the end Bessemer sued Fiserv and Fiserv counterclaimed.

Fiserv said Bessemer breached its contract, among other things, and wanted attorney fees.

Much of the argument seems to be around the security review, which, if accurate, shows that Fiserv’s software is not secure, something other Fiserv customers might want to know about.

Fiserv says that Bessemer just wants to embarrass Fiserv and get out of paying some bills.

Without spending a lot of time reviewing legal documents, it appears that Bessemer was not happy with Fiserv’s response to being notified about the bugs (as in, fixing them promptly) and wants to terminate the contract.

Fiserv appears to want to silence a critic (boy, is that failing) and doesn’t want to let the customer out of its contract.

So what does that mean for you if you sign a contract with a vendor? Here are some thoughts.

  • The vendor is going to want you to sign as long a contract as possible and will usually offer you a price incentive to do so. If this is a new vendor, that is likely not a good deal for you. Shorter might make more sense.
  • You should review the reasons that you can terminate the contract and what that termination will cost you.
  • You should look for any clauses that stop you from talking about the vendor’s product quality. This is different than disclosing secrets. While bugs and security flaws may be secret, they should not be covered by these types of contract restrictions.
  • Vendors should have a fixed amount of time to fix serious bugs or you should be able to terminate your contract.
  • The contract should spell out that the vendor is liable for your losses as a result of security bugs. Software vendors will resist this like the plague, but why should you be responsible for their bad software?

The lawsuit is ongoing. It will be interesting to see how this works out. Given this is now in the news, Fiserv might be smart to try and make it go away. Quietly. A trial could be ugly. On the other hand, Fiserv has a lot more money than Bessemer does.

Stay tuned.

But think about those contracts you signed and how you would fare in a similar situation.

On the other side, if you are a software vendor, how would you handle this situation?

Credit: Security Week