My design meets timing without ChipScope. However, when I insert ChipScope into my design, my inverted clock is no longer inverted in the ILOGIC block, but instead by a LUT. This increases my inverted clock path delay and causes timing violations. How can I fix this problem?
One workaround is to use the non-inverted clock as the ChipScope trigger/data channel and sample on the falling edge instead of the rising edge. Refer to the ChipScope documentation for how to change this setting.
Map is unable to pack the inverter into an ILOGIC block when ChipScope Inserter uses the inverted clock as one of its trigger/data channels. ChipScope Inserter adds KEEP and S (Save Net) constraints to the output of the LUT that inverts the clock so that the net is left intact. These constraints prevent Map from optimizing the inverted clock path by replacing the LUT-based inverter with the inverter in the ILOGIC block.
If you must use the inverted clock as a trigger/data channel, use ChipScope Generator instead. In your code, instantiate a unique, separate inverter for the inverted clock and connect its output to the ChipScope core. Map can then optimize the inverted clock path that is independent of ChipScope, while leaving the separate LUT-based inverter for the ChipScope core intact.
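The separate-inverter approach can be sketched in Verilog as follows. This is only an illustration of the idea, not code from the answer record: the module and port names (`ila_core`, `u_ila`, `TRIG0`) and the net names are placeholder assumptions, and your actual ILA core pinout will come from ChipScope Generator.

```verilog
// Hypothetical sketch: keep the functional inverted clock separate
// from the copy that ChipScope observes.

// Functional inversion: with no KEEP/S constraints on this net,
// Map is free to absorb the inversion into the ILOGIC block.
wire clk_inv_logic = ~clk;

// Unique, separate inverter feeding only the ChipScope core.
// Any KEEP/S constraints ChipScope places on this net no longer
// touch the functional inverted-clock path above.
wire clk_inv_ila = ~clk;

// Placeholder ILA instantiation (generated by ChipScope Generator).
ila_core u_ila (
    .CLK   (clk),
    .TRIG0 (clk_inv_ila)   // inverted clock observed by ChipScope
);
```

Note that a synthesis tool may try to merge the two identical inverters; if that happens, a KEEP attribute (or equivalent synthesis directive) on `clk_inv_ila` preserves the dedicated copy.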