When looking at the actual bitstream sent to a device, I find that the length count does not match what it theoretically should be.
What is the real Length Count equation?
For LC_Alignment = Length:
The Length Count is the number of bits counted from the moment INIT is released (that is, including the initial fill byte, the 24-bit length count itself, and everything through the end of the bitstream), minus seven bits.
For example, for the Spartan-XL xcs05xl:

Length Count = <total bits, including the final postamble (01111)>
             + <start-up bits needed to reach a multiple of 8>
             - 7

=> LC = 54528 + 8 - 7 = 54536 - 7 = 54529
The number 54536 can be found in the Program Data table of the device's data sheet, in the row labeled "PROM Size (bits)". See also the footnotes to that table, which describe the details.
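As a sanity check, here is a minimal Python sketch of the rule above. The function and parameter names are illustrative (they are not from any Xilinx tool or document), and the 54528/8 values are simply the xcs05xl example repeated; for a real device, take the PROM Size (bits) value from the data sheet instead.

```python
def length_count(bits_through_postamble: int, startup_fill_bits: int) -> int:
    """Length Count per the rule above: all bits from INIT release through
    the end of the bitstream, padded to a byte boundary, minus 7."""
    return bits_through_postamble + startup_fill_bits - 7

# Worked example from this answer record (Spartan-XL xcs05xl):
#   54528 bits through the final postamble, plus 8 start-up bits to reach
#   a multiple of 8 (54536, the PROM Size), minus 7 => 54529
assert length_count(54528, 8) == 54529
print(length_count(54528, 8))  # 54529
```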
AR# 7912
Date: 05/14/2014
Status: Archive
Type: General Article