I am trying to design a generic decoder with an N-bit input:
```vhdl
library ieee;
use ieee.std_logic_1164.all; -- standard unresolved logic: 'U','X','0','1','Z','W','L','H','-'
use ieee.numeric_std.all;    -- for the signed/unsigned types and arithmetic operations

entity decoder is
  generic (
    input_width : positive := 2
  );
  port (
    input  : in  std_logic_vector(input_width - 1 downto 0);
    output : out std_logic_vector(2**input_width - 1 downto 0)
  );
end entity decoder;

architecture behavioral of decoder is
begin
  output <= (to_integer(unsigned(input)) => '1', others => '0');
end architecture behavioral;
```
I am wondering what will happen if some bits of the input are 'U', 'X', or 'Z', for example: what will `unsigned(input)` (and `to_integer` applied to it) return?
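To probe this in simulation, I plan to drive the decoder with metavalues directly. This is an untested testbench sketch; the instance and generic names just mirror my entity above:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity decoder_tb is
end entity decoder_tb;

architecture sim of decoder_tb is
  signal input  : std_logic_vector(1 downto 0);
  signal output : std_logic_vector(3 downto 0);
begin
  dut : entity work.decoder
    generic map ( input_width => 2 )
    port map ( input => input, output => output );

  stimulus : process is
  begin
    input <= "01"; wait for 10 ns; -- normal case, expect output = "0010"
    input <= "UX"; wait for 10 ns; -- metavalues: what does to_integer do here?
    input <= "ZZ"; wait for 10 ns; -- high impedance on both bits
    wait;
  end process stimulus;
end architecture sim;
```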
Is it possible to force my output to all zeros when this case occurs?
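Concretely, I was thinking of guarding the index computation with `Is_X` from `std_logic_1164`, which reports whether a vector contains any metavalue. An untested sketch of the reworked architecture (same ports as above):

```vhdl
architecture guarded of decoder is
begin
  process (input) is
  begin
    -- default: all outputs low, so any metavalue on input yields all zeros
    output <= (others => '0');
    if not is_x(input) then
      -- input is clean '0'/'1' here, so the conversion is well-defined
      output(to_integer(unsigned(input))) <= '1';
    end if;
  end process;
end architecture guarded;
```

I assume `is_x` is only meaningful in simulation and would be ignored (or rejected) by synthesis, but I am not sure.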
More importantly, how are 'U', 'X', 'Z', etc. handled in synthesis? What will happen if we synthesize this decoder and feed it, for example, a 'Z' input (leaving the input pin as an open circuit)?