I'm assuming you're using DVI-A, based on your use of HSYNC, VSYNC, and DE?
You need to generate a pixel clock that matches the pixel rate of the incoming video. One way to do this is to oversample the input, detect bit transitions, and use them to locate the center of the data eye. Altera does something similar in their ASI core: there they extract bits from a 270 MHz serial data stream by 5x oversampling, using both the rising and falling edges of a 337.5 MHz clock (four sampling phases: 0, 90, 180, 270), which just barely meets timing in a Cyclone III. Fortunately for you, the DVI-A pixel clock tops out at around 165 MHz, so with the same four-samples-per-period scheme you can get away with a 206.25 MHz clock for 5x oversampling (165 MHz × 5 / 4 = 206.25 MHz) or 247.5 MHz for 6x.
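To make the transition-detection idea concrete, here is a minimal Python model of the recovery step (in real hardware this would be RTL driven by the multi-phase sampling clocks; the function names and the ideal, jitter-free input are my own assumptions for illustration):

```python
def oversample(bits, ratio=5):
    """Model an ideal 5x-oversampled capture of a serial bit stream:
    each transmitted bit appears as `ratio` consecutive samples."""
    return [b for b in bits for _ in range(ratio)]

def recover(samples, ratio=5):
    """Recover bits from an oversampled stream by finding a bit
    transition (a bit-cell boundary) and then sampling mid-eye.

    The first sample index where the value changes marks a boundary;
    the center of the data eye sits ratio // 2 samples past it, so we
    take every ratio-th sample starting at that phase."""
    # Locate the first transition; fall back to phase 0 if the stream
    # is constant (no transitions to lock onto).
    edge = next((i for i in range(1, len(samples))
                 if samples[i] != samples[i - 1]), 0)
    start = (edge + ratio // 2) % ratio
    return samples[start::ratio]
```

With an ideal input, `recover(oversample(bits))` returns the original bits; if the capture starts at an arbitrary phase within a bit cell, the same edge search still locks onto the eye center, at the cost of possibly losing the partial first bit.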