I'm new to FPGA designing. I'm working on a project where I'm handling large vectors for a certain application (pattern matching).
VHDL code:

    Generic (
        N       : integer := 128;
        M       : integer := 128;
        SAMPLES : integer := 128  -- Number of samples
    );
    Port (
        clk       : in  STD_LOGIC;
        reset     : in  STD_LOGIC;
        pattern_1 : in  STD_LOGIC_VECTOR(SAMPLES * N - 1 downto 0);
        pattern_2 : in  STD_LOGIC_VECTOR(SAMPLES * M - 1 downto 0);
        Result    : out STD_LOGIC_VECTOR(SAMPLES * M - 1 downto 0)
    );
I want to find out how large a vector the FPGA can accommodate given its available resources (i.e., N, M, and SAMPLES should not be restricted to 128). I know that putting these vectors in the entity's port list is not ideal, since we cannot map such large vectors to the FPGA I/O pins. What would be the best way to handle this? I'm using the Nexys4 FPGA board.
I don't understand what you're unsure about. If you don't have enough pins to match the number of bits in your vector, then you have to break those vectors into smaller chunks, load the chunks sequentially over a narrower interface, and reassemble them into the full vector internally.
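As a rough illustration of that idea, here is a minimal sketch of a loader entity (the entity name, port names, and generic values below are hypothetical, not from the question) that accepts one narrow word per clock cycle and shifts it into a wide internal register until the full pattern has been assembled:

    -- Hypothetical sketch: reassemble a wide pattern from WORD_W-bit chunks
    -- delivered one per clock. Names and widths are assumptions for
    -- illustration only.
    library IEEE;
    use IEEE.STD_LOGIC_1164.ALL;

    entity pattern_loader is
        Generic (
            WORD_W  : integer := 32;    -- width of the external data bus
            N_WORDS : integer := 512    -- chunks per pattern (e.g. 128*128/32)
        );
        Port (
            clk      : in  STD_LOGIC;
            reset    : in  STD_LOGIC;
            data_in  : in  STD_LOGIC_VECTOR(WORD_W - 1 downto 0);
            data_vld : in  STD_LOGIC;   -- pulse high for each valid word
            pattern  : out STD_LOGIC_VECTOR(WORD_W * N_WORDS - 1 downto 0);
            done     : out STD_LOGIC    -- high once all words are captured
        );
    end entity;

    architecture rtl of pattern_loader is
        signal buf : STD_LOGIC_VECTOR(WORD_W * N_WORDS - 1 downto 0);
        signal cnt : integer range 0 to N_WORDS := 0;
    begin
        process(clk)
        begin
            if rising_edge(clk) then
                if reset = '1' then
                    cnt  <= 0;
                    done <= '0';
                elsif data_vld = '1' and cnt < N_WORDS then
                    -- shift left by one word and insert the new chunk
                    buf <= buf(WORD_W * (N_WORDS - 1) - 1 downto 0) & data_in;
                    if cnt = N_WORDS - 1 then
                        done <= '1';
                    end if;
                    cnt <= cnt + 1;
                end if;
            end if;
        end process;
        pattern <= buf;
    end architecture;

On a board like the Nexys4 the chunks would typically arrive over some standard narrow interface (e.g. UART or a switch/button-driven test harness); whichever one you use, the reassembly pattern is the same. Note that the internal vector still consumes fabric resources, so synthesis will ultimately tell you how large N, M, and SAMPLES can grow.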