This is too broad a question. There are too many factors:
Do we actually know what algorithms to use? What is the application? Do you already have IP in-house? What are the interfaces? What are the engineers' skill levels? What are the timing challenges?
It can really vary from project to project.
For example, I recently worked on a project that had most of the HDL already coded, but some upgrades were needed. With 3 engineers over 4 months, my entire effort went into place and route, as timing and floorplanning were a massive problem: the clock speed was close to the limits of the device, with 70%+ of the logic resources and 90%+ of the RAM used. But on other projects I've spent months in phases 1 and 2 with just days in phase 3 (no resource problems).
So each project is different.
Thanks for the response.
So if I understand the scenario you described, you had exactly 100% IP reuse and 0% new design. The effort breakout for the project was:
Preliminary Design: 0 days, 0% of the total
Detailed Design: 0 days, 0% of the total
Implementation: 4 months effort, 100% of the total
Now consider that exact same project, except that you planned to build it from scratch: 0% IP reuse, 100% new design. Assume you expect the exact same amount of timing trouble (perhaps not realistic, but let's hold it constant for discussion purposes). Exact same application; developers equally skilled (not the exact same skills, but the same level of skill for the type of work they'll be doing, i.e. both experts or both novices); exact same interfaces. Everything that affects the effort is held constant except the split between IP reuse and new design. What would the new breakout be? Here's my guess:
Preliminary Design: 3 months effort, 27% of the total
Detailed Design: 4 months effort, 36% of the total
Implementation: 4 months effort, 36% of the total
Then, assume 50% IP reuse, 50% new design. My guess (there's a code sketch of the scaling I'm assuming right after this list):
Preliminary Design: 2 months effort, 22% of the total
Detailed Design: 3 months effort, 33% of the total
Implementation: 4 months effort, 45% of the total
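To make the assumption behind these guesses explicit, here's a minimal Python sketch. Everything in it is hypothetical: design-phase effort scales with an "effective" new-design fraction, implementation stays fixed, and the made-up `reuse_overhead` knob reflects my hunch that reused IP still costs some integration effort (which is why my 50% numbers sit a bit above a pure linear scale-down):

```python
# Hypothetical scaling model behind the breakout guesses above.
# Design-phase effort scales with the "effective" new-design fraction;
# implementation (place & route, timing closure) is held constant.
# reuse_overhead is a made-up factor for the cost of integrating reused IP.

def effort_breakout(new_fraction, reuse_overhead=0.3,
                    prelim_full=3.0,   # months at 100% new design (my guess)
                    detail_full=4.0,   # months at 100% new design (my guess)
                    impl_fixed=4.0):   # months, held constant per the scenario
    effective = new_fraction + reuse_overhead * (1.0 - new_fraction)
    phases = [("Preliminary Design", prelim_full * effective),
              ("Detailed Design",    detail_full * effective),
              ("Implementation",     impl_fixed)]
    total = sum(months for _, months in phases)
    for name, months in phases:
        print(f"{name}: {months:.1f} months, {100 * months / total:.0f}% of total")
    print()

effort_breakout(1.0)                      # 100% new: 3.0/4.0/4.0 -> 27%/36%/36%
effort_breakout(0.5)                      # 50% new:  roughly 2/2.6/4.0 months
effort_breakout(0.0, reuse_overhead=0.0)  # 0% new (idealized): 0/0/4.0 -> 100% impl
```

The 100% reuse call sets `reuse_overhead` to 0 to match the idealized zero-effort design phases above; in reality even the "pure reuse" project had some upgrade work, as you described.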
- - - Updated - - -
Basically, my question is:
Let's assume you sat down at a computer, opened a "cost model" that supposedly estimates FPGA development costs, and described your project. You told the model the application, how many timing issues you expect to encounter, your interface complexity, developer skill levels, where your IP comes from, and predicted size (logic elements, gates, HDL lines of code, or whatever makes the most sense to you). You also say it will be 100% new design (aka 0% reuse). You click RUN, and it gives you some numbers.
How would you expect those numbers to change if you reran the model with 50% new design? How about 0%? Everything else remains unchanged. If it depends on the specific scenario, perhaps you could pick one common scenario and describe it, then explain why the answer would be wrong in a different scenario, so I can see your thought process.
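For reference, the kind of model I have in mind works the way COCOMO II does for software: reused code gets converted into an "equivalent" new size at a discount before the effort equation runs. A rough sketch of that shape, with purely illustrative constants that are certainly not calibrated for FPGA/HDL work:

```python
# COCOMO II-style shape (a software model; constants illustrative only).
# Reused code is discounted into an "equivalent" new size, then effort
# grows slightly super-linearly with that size.

A, B = 2.94, 1.10      # nominal COCOMO II-style coefficient and exponent
REUSE_DISCOUNT = 0.3   # hypothetical: reused code costs 30% of new code

def estimate_person_months(total_ksloc, new_fraction):
    new_size = total_ksloc * new_fraction
    reused_equiv = total_ksloc * (1.0 - new_fraction) * REUSE_DISCOUNT
    return A * (new_size + reused_equiv) ** B

for nf in (1.0, 0.5, 0.0):
    print(f"{nf:.0%} new design: {estimate_person_months(50, nf):.0f} person-months")
```

So really I'm asking: for FPGA work, what discount factor and what curve shape would you expect, and how would the savings split across the three phases?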
I truly appreciate your help; I'd be screwed without folks way smarter and more knowledgeable than me helping to figure this out. You're a lifesaver.