How do I use 16-bit input, logic and output?
Posted: Mon May 10, 2021 10:26 am
Hi everyone -
I'm new to using Logic Circuit, and loving it so far. I've only run through some basics, like constructing basic logic chips from NANDs, and at the single-bit level everything works fine.
I've now moved on to making 16-bit variants, and I've got myself in a tangle, so I'm hoping someone can explain some basics to me, please.
I was just going to test things by having:
1. An input, set to 16 bits wide
2. A NAND, where I selected 16 from the drop-down next to the out-of-the-box gate
3. An output, again set to 16 bits
However, when I try to pull a truth table, I get a message saying the bit widths of the input and the NAND don't match.
I'm wondering whether what I'm trying to do is a nonsense approach, and whether I need to do something else instead (like making my own circuit with 16 inputs connected to 16 NANDs feeding 16 outputs... which doesn't sound quite right either).
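For what it's worth, the end behaviour I'm after is just a bitwise NAND of two 16-bit values, something like this rough Python sketch (purely to show intent; the function name is mine, not anything from Logic Circuit itself, and I'm assuming two 16-bit operands):

def nand16(a: int, b: int) -> int:
    """Bitwise NAND of two 16-bit values, masked back to 16 bits."""
    return ~(a & b) & 0xFFFF

# NAND of all-ones with itself should give all-zeros, and vice versa:
print(f"{nand16(0xFFFF, 0xFFFF):016b}")  # 0000000000000000
print(f"{nand16(0xFFFF, 0x0000):016b}")  # 1111111111111111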
All advice welcome, and thanks in advance!