[coreboot] RFC: coding style: "standard" defines

Nico Huber nico.h at gmx.de
Thu Feb 4 23:21:03 CET 2016


On 04.02.2016 10:35, Patrick Georgi via coreboot wrote:
> I think we should seek uniformity here: decide on some style,
> recommend it, clean up the tree to match, and help people stay
> consistent through lint tests.
That's a good idea.

> 2. BIT16 vs BIT(16) vs (1 << 16) vs 0x10000
> I don't think it makes sense to go for a single one of these (0x3ff is
> certainly more readable than BIT9 | BIT8 | BIT7 | BIT6 | BIT5 | BIT4
> | BIT3 | BIT2 | BIT1 | BIT0), but I doubt we need both
> BIT16 and BIT(16).
I would prefer not to have definitions for single bits. Maybe it's just
the code that I've seen using this pattern, but I don't find it readable
at all. If there are more than three BITs in an expression, I lose
track of which are set. Typography really isn't my domain, but I think
the problem is the dominant BIT prefix, which makes all those BITs
look the same.

Another argument against BIT* and BIT(x) definitions: It encourages
programmers to split numbers into single bits. Something that I've seen
too often: A datasheet defines some single bits and some numbers in a
word. Let's say bits 30, 15, 8-5 and 1. What we get to review looks
like:
  something |= BIT30 | BIT15 | BIT8 | BIT6 | BIT1; // Welcome to hell!
Instead we could have:
  something |= (1 << 30) | (1 << 15) | (10 << 5) | (1 << 1); // is it really 10?
(What I can't recall ever having seen: a mix of BIT* with plain numbers.)

Nico
