Asserts considered harmful (or GMP spills its sensitive information)

Jeffrey Walton noloader at gmail.com
Mon Dec 31 19:38:17 UTC 2018


On Mon, Dec 31, 2018 at 2:16 PM Vincent Lefevre <vincent at vinc17.net> wrote:
>
> On 2018-12-31 13:03:27 -0500, Jeffrey Walton wrote:
> > The GMP library uses asserts to crash a program at runtime when
> > presented with data it did not expect. The library also ignores user
> > requests to remove asserts using Posix's -DNDEBUG. Posix asserts are a
> > debugging aid intended for development, and using them in production
> > software ranges from questionable to insecure.
>
> That's much better than letting the program run erratically, with
> possible memory corruption and/or sensitive information leakage
> to unauthorized users. You'd better fix bugs in your program.

To play devil's advocate for this particular example, GMP could have
validated the parameters and refused to process the data. That is, the
function could have returned failure and avoided the potential
information leak.

> > Many programs can safely use assert to crash a program at runtime.
> > However, the prerequisite is that the program does not handle sensitive
> > information like user passwords, user keys or sensitive documents.
> >
> > High integrity software, like GMP and Nettle, cannot safely use an
> > assert to crash a program. To understand why, the data flow must be
> > examined. First, when an assert fires, a SIGABRT is eventually sent to
> > the program on Unix and Linux
> > (http://pubs.opengroup.org/onlinepubs/009695399/functions/assert.html).
> >
> > Second, the SIGABRT terminates the process and can write a core file.
>
> That's the default behavior, but you can trap SIGABRT if you want.
> Of course, there is no guarantee because the memory may already be
> in an inconsistent state.

To play devil's advocate again, that strategy requires every developer
to have the knowledge and to install the SIGABRT handler. On the other
hand, developers are usually pretty good about checking return values
at a call site.

> > This is the first point of unwanted data egress. Sensitive information
> > like user passwords and keys can be written to the filesystem
> > unprotected.
>
> This can occur with any program, even not using asserts, e.g. due to
> a segmentation fault (which may happen as a consequence of not using
> asserts, with possibly worse consequences).
>
> If you don't want a core file, then you can instruct the kernel not
> to write a core file. See getrlimit.

To play devil's advocate again, that strategy requires every user to
have the knowledge. If RTFM was going to work, it would have happened
in the last 50 years or so.

Refusing to process the data and failing the API call requires no
knowledge on the user's part.

> > Third, the dump is sometimes sent to an error reporting service like
> > Apple Crash Report, Android Crash Report, Ubuntu Apport, and Windows
> > Error Reporting. This is the second point of unwanted data egress.
> > Sensitive information can be sent to the error reporting service. The
> > platform provider like Apple, Google, Microsoft and Ubuntu gain access
> > to the sensitive information, in addition to the developer.
>
> If you don't like them, do not use these services. Not using asserts
> can also yield a crash, which will have the same consequences.

I hope I don't sound too argumentative, but the summary seems to
conflate two different cases. You seem to be arguing that all crashes
are outside the program's control. That holds sometimes, but not always.

In this instance the library did not validate parameters and return an
error code. Instead it chose to crash. The library was not an
innocent victim of memory corruption. It was a willing participant
in the data egress. Instigator may be a better term than participant
in this case.

Jeff
