There are two main issues with these constants:
* They don't follow the Go naming convention.
* They use the name "TWI", while it makes more sense to refer to the
peripheral by its actual name, which is I2C (or I²C).
I have not removed them but just deprecated them. Perhaps we can remove
them when we move towards version 1.0.
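As a hypothetical sketch of the deprecation pattern (the constant names below are made up, not the exact ones in the machine package): the new I2C-style name carries the definition, and the old TWI-style name stays behind as a deprecated alias so existing code keeps compiling until it can be removed.

```go
package main

const (
	// Hypothetical new, Go-style name.
	I2CFreq100kHz = 100000

	// Deprecated: use I2CFreq100kHz instead.
	TWI_FREQ_100KHZ = I2CFreq100kHz
)

func main() {
	println(TWI_FREQ_100KHZ) // still usable, but deprecated
}
```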
Do it all at once in preparation for Go 1.18 support.
To make this commit, I've simply modified the `fmt-check` Makefile
target to rewrite files instead of listing the differences. So this is a
fully mechanical change; it should not have introduced any errors.
This can be useful for several purposes:
* It makes it possible to disable the UART in cases where it is not
needed or must be disabled to conserve power.
* It makes it possible to disable the serial output to reduce code
size, which may be important for some chips. Sometimes, a few kB can
be saved this way.
* It makes it possible to override the default; for example, you might
want to use an actual UART to debug the USB-CDC implementation (see
the sketch below).
It also lowers the dependency on having machine.Serial defined, which is
often not defined when targeting a chip. Eventually, we might want to
make it possible to write, for example, `-target=nrf52` or
`-target=atmega328p` to target the chip itself with no board-specific
assumptions.
The defaults don't change. I checked this by running `make smoketest`
before and after and comparing the results.
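From user code, the default output is simply whatever the target's serial output is configured as. A minimal sketch, assuming the usual println behavior on TinyGo targets:

```go
package main

func main() {
	// println (and the runtime's own output) goes to the default serial
	// output for the target: a hardware UART, the USB-CDC interface, or
	// nothing at all when serial output is disabled to save code size or
	// power.
	println("hello from the default serial output")
}
```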
Previously, the machine.UART0 object had two meanings:
- it was the first UART on the chip
- it was the default output for println
These two meanings conflict, and resulted in workarounds like:
- Defining UART0 to refer to the USB-CDC interface (atsamd21,
atsamd51, nrf52840), even though that clearly isn't a UART.
- Defining NRF_UART0 to avoid a conflict with UART0 (which was
redefined as a USB-CDC interface).
- Defining aliases like UART0 = UART1, which refer to the same
hardware peripheral (stm32).
This commit changes this to use a new machine.Serial object for the
default serial port. It might refer to the first or second UART
depending on the board, or even to the USB-CDC interface. Also, UART0
now really refers to the first UART on the chip, no longer to a USB-CDC
interface.
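As a rough before/after sketch (board details vary, and the WriteByte calls are assumed from the usual machine serial interface rather than spelled out in this commit):

```go
package main

import "machine"

func main() {
	// Default serial output: may be a hardware UART or the USB-CDC
	// interface, depending on the board.
	machine.Serial.WriteByte('!')

	// Strictly the first hardware UART on the chip. On boards whose default
	// output is USB-CDC, this is now a different object than machine.Serial.
	machine.UART0.WriteByte('!')
}
```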
The changes in the runtime package are all just search+replace. The
changes in the machine package are a mixture of search+replace and
manual modifications.
This commit does not affect binary size; in fact, it doesn't affect the
resulting binary at all.
This means that machine.UART0, machine.UART1, etc. are of type
*machine.UART, not machine.UART. This makes them easier to pass around
and avoids surprises when they are passed by value even though they
should be passed by reference.
There is a small code size impact in some cases, but it is relatively
minor.
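A minimal sketch of what this enables (the helper function is hypothetical; the UARTConfig fields are the usual machine ones):

```go
package main

import "machine"

// setupUART is a hypothetical helper. Because machine.UART0 already has type
// *machine.UART, it can be passed in directly and its state (such as the
// receive buffer) is shared rather than copied.
func setupUART(u *machine.UART) {
	u.Configure(machine.UARTConfig{BaudRate: 115200})
}

func main() {
	setupUART(machine.UART0)
}
```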
This makes it possible to assign I2C objects (machine.I2C0,
machine.I2C1, etc.) without needing to take a pointer.
This will be especially important in the future, when I2C may be driven
using DMA and the machine.I2C type needs to store some state.
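A small sketch of the resulting usage (the I2CConfig fields are assumed from the usual machine API, not from this commit):

```go
package main

import "machine"

func main() {
	// machine.I2C0 can simply be assigned; there is no need for &machine.I2C0,
	// and the assignment shares the peripheral's state instead of copying it.
	i2c := machine.I2C0
	i2c.Configure(machine.I2CConfig{Frequency: 400e3})
}
```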
These stubs don't really belong there: attiny currently doesn't directly
support I2C at all (although it has hardware to support a software
implementation).
This commit changes pin numbering for atmega328-based boards (Uno, Nano)
to use the standard format, where the pin number is determined by the
port and pin. Previously, pin numbers were based on the numbering the Uno
uses, which does not seem to follow a clear pattern.
One difference is that counting starts at port B, as there is no port A.
So PB0 is 0, PB1 is 1… PC0 is 8.
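In other words, with 8 pins per port and port B as the first port, the pin number is simply 8*portIndex + bitNumber. The constants below are illustrative only; the real definitions live in the machine package:

```go
// Pin number = 8*portIndex + bitNumber, with port B as index 0.
const (
	PB0 = 0  // port B, bit 0
	PB1 = 1  // port B, bit 1
	PC0 = 8  // port C, bit 0
	PD7 = 23 // port D, bit 7
)
```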
This commit also moves PWM code to the atmega328 file, as it may not be
generic to all ATmega chips.
Not tested on actual hardware, only on simavr. The main motivation for
adding this chip is to be able to run simulated tests with a much
larger memory space (16kB RAM, 128kB flash) without jumping to the XMega
devices, which may not be as well supported by LLVM.
This commit lets the compiler know about interrupts and allows
optimizations to be performed based on that: interrupts are eliminated
when they appear to be unused in a program. This is done with a new
pseudo-call (runtime/interrupt.New) that is treated specially by the
compiler.
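A minimal sketch of the pattern (the IRQ number is made up; real code uses a chip-specific constant, and the exact method set of interrupt.Interrupt is assumed here):

```go
package main

import "runtime/interrupt"

func handleIRQ(intr interrupt.Interrupt) {
	// Handle the interrupt here.
}

// interrupt.New is recognized by the compiler: if the resulting interrupt is
// never enabled or otherwise used, the handler can be optimized away.
var myInterrupt = interrupt.New(2, handleIRQ) // 2 is a made-up IRQ number

func main() {
	myInterrupt.Enable()
}
```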
These all-caps constants don't follow Go naming conventions, so rename
them to CPUFrequency, which is more in line with Go style. Additionally,
make it a function so that support for changing the frequency can be
added in the future.
Tested by running `make smoketest`. None of the outputs changed.
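For illustration, usage after the rename looks roughly like this:

```go
package main

import "machine"

func main() {
	// CPUFrequency is a function rather than a constant, which leaves room
	// to support changing the clock frequency at runtime later on.
	println("CPU frequency:", machine.CPUFrequency())
}
```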
* machine/uart: add core support for multiple UARTs by allowing for multiple RingBuffers
* machine/uart: complete core support for multiple UARTs
* machine/uart: no need to store pointer to UART, better to treat like I2C and SPI
* machine/uart: increase ring buffer size to 128 bytes
* machine/uart: improve godoc comments and use comma-ok idiom for buffer Put/Get methods
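A rough sketch of the comma-ok style mentioned in the last item, using a toy ring buffer rather than the machine package's actual implementation:

```go
package main

// ringBuffer is a toy fixed-size buffer used only to illustrate the
// comma-ok style Put/Get methods described above.
type ringBuffer struct {
	buf        [128]byte
	head, tail int
	used       int
}

// Put stores a byte and reports whether it fit in the buffer.
func (rb *ringBuffer) Put(b byte) bool {
	if rb.used == len(rb.buf) {
		return false
	}
	rb.buf[rb.head] = b
	rb.head = (rb.head + 1) % len(rb.buf)
	rb.used++
	return true
}

// Get returns the next byte and whether one was available.
func (rb *ringBuffer) Get() (byte, bool) {
	if rb.used == 0 {
		return 0, false
	}
	b := rb.buf[rb.tail]
	rb.tail = (rb.tail + 1) % len(rb.buf)
	rb.used--
	return b, true
}

func main() {
	var rb ringBuffer
	if !rb.Put('x') {
		println("buffer full, byte dropped")
	}
	if b, ok := rb.Get(); ok {
		println("got byte:", b)
	}
}
```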