The MIPI CSI-2 RX Subsystem for 7 Series contains a Calibration Mode setting that can be set to None, Fixed, or Auto.
When Calibration Mode is set to Auto, the MIPI CSI-2 RX Subsystem performs clock/data skew adjustment by automatically finding the optimal IDELAY fine-resolution delay tap value.
After changing the calibration mode from "None" to "Auto", my FPGA implementation fails.
What is causing this implementation failure?
The MIPI CSI-2 RX Subsystem does not require a specific I/O standard. Users targeting 7 Series devices should follow the I/O standards recommended by XAPP894 (LVDS_25 for the High Speed (HS) pins, HSUL_12 for the Low Power (LP) pins).
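As a sketch, the XAPP894-recommended standards could be applied with XDC constraints like the following (the port names here are hypothetical and must match your own top-level pinout):

```tcl
## XAPP894-style I/O standards for a 7 Series MIPI CSI-2 RX design
## (port names are illustrative, not taken from the IP)
set_property IOSTANDARD LVDS_25 [get_ports {mipi_clk_hs_p mipi_clk_hs_n}]
set_property IOSTANDARD LVDS_25 [get_ports {mipi_data_hs_p[*] mipi_data_hs_n[*]}]
set_property IOSTANDARD HSUL_12 [get_ports {mipi_clk_lp_p mipi_clk_lp_n}]
set_property IOSTANDARD HSUL_12 [get_ports {mipi_data_lp_p[*] mipi_data_lp_n[*]}]
```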
According to the 7 Series datasheet, LVDS_25 and HSUL_12 input pins can share a bank with pins using I/O standards that have different VCCO requirements, provided the DIFF_TERM=FALSE setting is used.
By default, these pins have DIFF_TERM=FALSE, so pins with other I/O standards (e.g. LVCMOS18 pins controlling the sensor I2C bus) can be placed in the same bank as the MIPI CSI-2 RX Subsystem inputs.
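In that default case, constraints like the following (again with hypothetical port names) are legal within a single bank:

```tcl
## DIFF_TERM defaults to FALSE on the HS input pairs, so the bank's VCCO
## is not forced to 2.5 V and other standards can coexist in the bank:
set_property DIFF_TERM FALSE [get_ports {mipi_clk_hs_p mipi_clk_hs_n}]
set_property IOSTANDARD LVCMOS18 [get_ports {sensor_iic_scl sensor_iic_sda}]
```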
However, if you change the calibration mode to Auto, the MIPI CSI-2 RX Subsystem changes the High Speed input clock/data pin setting to DIFF_TERM=TRUE.
DIFF_TERM=TRUE requires the High Speed input pins to be placed in banks with VCCO=2.5V, so any pin in the same bank whose I/O standard requires a different VCCO (such as LVCMOS18) now produces a bank-compatibility DRC error, which is why the implementation fails.
You will need to be aware of this change to the I/O configuration when designing your system.
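As an illustration (port names hypothetical), with Calibration Mode = Auto the effective constraints on the High Speed pairs become:

```tcl
## Auto calibration enables the on-die differential termination, which
## 7 Series HR banks only support when the bank is powered at VCCO = 2.5 V:
set_property DIFF_TERM TRUE [get_ports {mipi_clk_hs_p mipi_clk_hs_n}]
set_property DIFF_TERM TRUE [get_ports {mipi_data_hs_p[*] mipi_data_hs_n[*]}]
## Any pins requiring a different VCCO (e.g. the LVCMOS18 I2C pins above)
## must now be moved to a different bank for implementation to pass.
```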