Fix buffer underrun (invalid read) triggered during diagnostic rendering. The loop test 'EndColNo-1' would overflow when computing '0 - 1' on the unsigned column index, so 'map.getSourceLine()[EndColNo-1]' read one byte before the start of the line buffer.
I don't have a good testcase for this that does not depend on system headers. It did not trigger with preprocessed output, and I had trouble reducing the example. Fixes <rdar://problem/13324594>. Thanks to Michael Greiner for reporting this issue. llvm-svn: 177201
parent e7b849e4fb
commit 90d7fa12d0
@@ -958,7 +958,7 @@ static void highlightRange(const CharSourceRange &R,
   // Pick the last non-whitespace column.
   if (EndColNo > map.getSourceLine().size())
     EndColNo = map.getSourceLine().size();
-  while (EndColNo-1 &&
+  while (EndColNo &&
          (map.getSourceLine()[EndColNo-1] == ' ' ||
           map.getSourceLine()[EndColNo-1] == '\t'))
     EndColNo = map.startOfPreviousColumn(EndColNo);
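
Why the old condition underruns: 'EndColNo' is unsigned, so when it is 0 the test 'EndColNo-1' wraps around instead of stopping the loop, and the body then indexes before the start of the line. A minimal standalone sketch of the pattern follows (the function name and the plain std::string stand-in for the source line are illustrative, not the actual TextDiagnostic.cpp code):

#include <cassert>
#include <string>

// Trim a diagnostic end column back past trailing whitespace.
// Mirrors the fixed loop above; EndColNo is unsigned, as in clang.
static unsigned pickLastNonWhitespaceColumn(const std::string &SourceLine,
                                            unsigned EndColNo) {
  if (EndColNo > SourceLine.size())
    EndColNo = SourceLine.size();
  // Old test 'EndColNo-1': with EndColNo == 0 this computes 0u - 1u,
  // which wraps to a huge nonzero value, so the body runs and
  // SourceLine[EndColNo-1] is an out-of-bounds read.
  // New test 'EndColNo': the loop stops at column 0, and whenever the
  // body runs EndColNo >= 1, so SourceLine[EndColNo-1] is in bounds.
  while (EndColNo &&
         (SourceLine[EndColNo-1] == ' ' || SourceLine[EndColNo-1] == '\t'))
    --EndColNo; // stands in for map.startOfPreviousColumn(EndColNo)
  return EndColNo;
}

int main() {
  assert(pickLastNonWhitespaceColumn("int x;  ", 8) == 6); // trims two spaces
  assert(pickLastNonWhitespaceColumn("   ", 3) == 0);      // all whitespace
  assert(pickLastNonWhitespaceColumn("", 0) == 0);         // crashed pre-fix
  return 0;
}

With the fixed test, the loop exits cleanly at column 0, and whenever the body executes the index 'EndColNo-1' is guaranteed to be in bounds.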