commit 43ee367d1e
The values for the char/short/int/long minimums were declared with their
literal values rather than with the definitions from the CL spec (v1.1).
As a result, (-2147483648) was actually being treated as a long by the
compiler, not an int, which caused issues when trying to add/subtract
that value from a vector.

Update the definitions to use the values declared by the spec, and also
add explicit casts for the char/short/int minimums so that the compiler
actually treats them as chars/shorts/ints. Without those casts, the
char/short minimums end up stored as ints, and the compiler may end up
storing INT_MIN as a long. The compiler can sign extend the values if it
needs to convert char->short, short->int, or int->long.

v2: Add an explicit cast for INT_MIN and fix some typos and wrapping in
the commit message.

Reported-by: Moritz Pflanzer <moritz.pflanzer14@imperial.ac.uk>
CC: Moritz Pflanzer <moritz.pflanzer14@imperial.ac.uk>
Reviewed-by: Tom Stellard <thomas.stellard@amd.com>
Signed-off-by: Aaron Watry <awatry@gmail.com>

llvm-svn: 247661
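For illustration, here is a minimal sketch of the kind of definitions the
commit describes, using the limit values from the OpenCL 1.1 spec. The
macro layout below is an assumption for illustration and may not match
the actual libclc header:

```c
/* Illustrative sketch only; the exact libclc header contents are not
 * shown in this commit excerpt.
 *
 * Why (-2147483648) goes wrong: the unsuffixed literal 2147483648 does
 * not fit in an int, so the expression is parsed as unary minus applied
 * to a long literal and therefore has type long. Writing the minimum as
 * (-2147483647 - 1) keeps every subexpression within int range.
 */
#define CHAR_MAX  SCHAR_MAX
#define CHAR_MIN  SCHAR_MIN
#define SCHAR_MAX ((char)127)
#define SCHAR_MIN ((char)(-127 - 1))        /* cast: otherwise promoted to int */
#define SHRT_MAX  ((short)32767)
#define SHRT_MIN  ((short)(-32767 - 1))     /* cast: otherwise promoted to int */
#define INT_MAX   2147483647
#define INT_MIN   ((int)(-2147483647 - 1))  /* v2: explicit int cast */
#define LONG_MAX  0x7fffffffffffffffL
#define LONG_MIN  (-0x7fffffffffffffffL - 1L)
```

With the casts in place, an expression such as subtracting CHAR_MIN from
a char vector stays a char operation instead of being silently promoted
to a wider type, which is the mismatch the bug report describes.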