[lldb][NFC] Fix all formatting errors in .cpp file headers
Summary:
A *.cpp file header in LLDB (and in LLVM in general) should look like this:
```
//===-- TestUtilities.cpp -------------------------------------------------===//
```
However, in LLDB most of our source files have arbitrary deviations from this format,
and these deviations keep spreading because people usually use existing source files
as templates for new ones. Most notably, the unnecessary editor language indicator
`-*- C++ -*-` keeps being copied, and in every review someone points out that it is
wrong, to which the author replies that other files do it the same way.
This patch removes most of these inconsistencies: the editor language indicators,
the various missing or extra '-' characters, headers that center the file name, and
missing trailing `===//` (mostly caused by clang-format breaking the line).
Reviewers: aprantl, espindola, jfb, shafik, JDevlieghere
Reviewed By: JDevlieghere
Subscribers: dexonsmith, wuzish, emaste, sdardis, nemanjai, kbarton, MaskRay, atanasyan, arphaman, jfb, abidh, jsji, JDevlieghere, usaxena95, lldb-commits
Tags: #lldb
Differential Revision: https://reviews.llvm.org/D73258
2020-01-24 15:23:27 +08:00
```
//===-- Options.cpp -------------------------------------------------------===//
//
// Part of the LLVM Project, under the Apache License v2.0 with LLVM Exceptions.
// See https://llvm.org/LICENSE.txt for license information.
// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
//
//===----------------------------------------------------------------------===//

#include "lldb/Interpreter/Options.h"

#include <algorithm>
#include <bitset>
#include <map>
#include <set>

#include "lldb/Host/OptionParser.h"
#include "lldb/Interpreter/CommandCompletions.h"
#include "lldb/Interpreter/CommandInterpreter.h"
#include "lldb/Interpreter/CommandObject.h"
#include "lldb/Interpreter/CommandReturnObject.h"
#include "lldb/Target/Target.h"
#include "lldb/Utility/StreamString.h"
#include "llvm/ADT/STLExtras.h"

using namespace lldb;
using namespace lldb_private;

// Options
Options::Options() { BuildValidOptionSets(); }

Options::~Options() = default;

void Options::NotifyOptionParsingStarting(ExecutionContext *execution_context) {
  m_seen_options.clear();
  // Let the subclass reset its option values
  OptionParsingStarting(execution_context);
}

Status
Options::NotifyOptionParsingFinished(ExecutionContext *execution_context) {
  return OptionParsingFinished(execution_context);
}

void Options::OptionSeen(int option_idx) { m_seen_options.insert(option_idx); }

// Returns true if set_a is a subset of set_b; otherwise returns false.
bool Options::IsASubset(const OptionSet &set_a, const OptionSet &set_b) {
  bool is_a_subset = true;
  OptionSet::const_iterator pos_a;
  OptionSet::const_iterator pos_b;

  // set_a is a subset of set_b if every member of set_a is also a member of
  // set_b.
  for (pos_a = set_a.begin(); pos_a != set_a.end() && is_a_subset; ++pos_a) {
    pos_b = set_b.find(*pos_a);
    if (pos_b == set_b.end())
      is_a_subset = false;
  }

  return is_a_subset;
}

// Returns the set difference set_a - set_b, i.e. { x | ElementOf (x, set_a) &&
// !ElementOf (x, set_b) }
size_t Options::OptionsSetDiff(const OptionSet &set_a, const OptionSet &set_b,
                               OptionSet &diffs) {
  size_t num_diffs = 0;
  OptionSet::const_iterator pos_a;
  OptionSet::const_iterator pos_b;

  for (pos_a = set_a.begin(); pos_a != set_a.end(); ++pos_a) {
    pos_b = set_b.find(*pos_a);
    if (pos_b == set_b.end()) {
      ++num_diffs;
      diffs.insert(*pos_a);
    }
  }

  return num_diffs;
}

// Returns the union of set_a and set_b. Does not put duplicate members into
// the union.
void Options::OptionsSetUnion(const OptionSet &set_a, const OptionSet &set_b,
                              OptionSet &union_set) {
  OptionSet::const_iterator pos;
  OptionSet::iterator pos_union;

  // Put all the elements of set_a into the union.
  for (pos = set_a.begin(); pos != set_a.end(); ++pos)
    union_set.insert(*pos);

  // Put all the elements of set_b that are not already there into the union.
  for (pos = set_b.begin(); pos != set_b.end(); ++pos) {
    pos_union = union_set.find(*pos);
    if (pos_union == union_set.end())
      union_set.insert(*pos);
  }
}

bool Options::VerifyOptions(CommandReturnObject &result) {
  bool options_are_valid = false;

  int num_levels = GetRequiredOptions().size();
  if (num_levels) {
    for (int i = 0; i < num_levels && !options_are_valid; ++i) {
      // This is the correct set of options if: 1). m_seen_options contains
      // all of m_required_options[i] (i.e. all the required options at this
      // level are a subset of m_seen_options); AND 2). { m_seen_options -
      // m_required_options[i] } is a subset of m_optional_options[i] (i.e.
      // all the rest of m_seen_options are in the set of optional options at
      // this level).

      // Check to see if all of m_required_options[i] are a subset of
      // m_seen_options
      if (IsASubset(GetRequiredOptions()[i], m_seen_options)) {
        // Construct the set difference: remaining_options = {m_seen_options} -
        // {m_required_options[i]}
        OptionSet remaining_options;
        OptionsSetDiff(m_seen_options, GetRequiredOptions()[i],
                       remaining_options);
        // Check to see if remaining_options is a subset of
        // m_optional_options[i]
        if (IsASubset(remaining_options, GetOptionalOptions()[i]))
          options_are_valid = true;
      }
    }
  } else {
    options_are_valid = true;
  }

  if (options_are_valid) {
    result.SetStatus(eReturnStatusSuccessFinishNoResult);
  } else {
    result.AppendError("invalid combination of options for the given command");
  }

  return options_are_valid;
}

// This is called in the Options constructor, though we could call it lazily
// if that ends up being a performance problem.
void Options::BuildValidOptionSets() {
  // Check to see if we already did this.
  if (m_required_options.size() != 0)
    return;

  // Check to see if there are any options.
  int num_options = NumCommandOptions();
  if (num_options == 0)
    return;

  auto opt_defs = GetDefinitions();
  m_required_options.resize(1);
  m_optional_options.resize(1);

  // First count the number of option sets we've got. Ignore
  // LLDB_ALL_OPTION_SETS...
  uint32_t num_option_sets = 0;

  for (const auto &def : opt_defs) {
    uint32_t this_usage_mask = def.usage_mask;
    if (this_usage_mask == LLDB_OPT_SET_ALL) {
      if (num_option_sets == 0)
        num_option_sets = 1;
    } else {
      for (uint32_t j = 0; j < LLDB_MAX_NUM_OPTION_SETS; j++) {
        if (this_usage_mask & (1 << j)) {
          if (num_option_sets <= j)
            num_option_sets = j + 1;
        }
      }
    }
  }

  if (num_option_sets > 0) {
    m_required_options.resize(num_option_sets);
    m_optional_options.resize(num_option_sets);

    for (const auto &def : opt_defs) {
      for (uint32_t j = 0; j < num_option_sets; j++) {
        if (def.usage_mask & 1 << j) {
          if (def.required)
            m_required_options[j].insert(def.short_option);
          else
            m_optional_options[j].insert(def.short_option);
        }
      }
    }
  }
}

uint32_t Options::NumCommandOptions() { return GetDefinitions().size(); }

Option *Options::GetLongOptions() {
  // Check to see if this has already been done.
  if (m_getopt_table.empty()) {
    auto defs = GetDefinitions();
    if (defs.empty())
      return nullptr;

    std::map<int, uint32_t> option_seen;

    m_getopt_table.resize(defs.size() + 1);
    for (size_t i = 0; i < defs.size(); ++i) {
      const int short_opt = defs[i].short_option;

      m_getopt_table[i].definition = &defs[i];
      m_getopt_table[i].flag = nullptr;
      m_getopt_table[i].val = short_opt;

      if (option_seen.find(short_opt) == option_seen.end()) {
        option_seen[short_opt] = i;
      } else if (short_opt) {
        m_getopt_table[i].val = 0;
        std::map<int, uint32_t>::const_iterator pos =
            option_seen.find(short_opt);
        StreamString strm;
        if (defs[i].HasShortOption())
          Host::SystemLog(Host::eSystemLogError,
                          "option[%u] --%s has a short option -%c that "
                          "conflicts with option[%u] --%s, short option won't "
                          "be used for --%s\n",
                          (int)i, defs[i].long_option, short_opt, pos->second,
                          m_getopt_table[pos->second].definition->long_option,
                          defs[i].long_option);
        else
          Host::SystemLog(Host::eSystemLogError,
                          "option[%u] --%s has a short option 0x%x that "
                          "conflicts with option[%u] --%s, short option won't "
                          "be used for --%s\n",
                          (int)i, defs[i].long_option, short_opt, pos->second,
                          m_getopt_table[pos->second].definition->long_option,
                          defs[i].long_option);
      }
    }

    // getopt_long_only requires a NULL final entry in the table:
    m_getopt_table.back().definition = nullptr;
    m_getopt_table.back().flag = nullptr;
```
|
|
|
|
m_getopt_table.back().val = 0;
|
2016-09-07 04:57:50 +08:00
|
|
|
}
|
2010-06-09 00:52:24 +08:00
|
|
|
|
|
|
|
if (m_getopt_table.empty())
|
2014-04-20 08:31:37 +08:00
|
|
|
return nullptr;
|
2010-06-09 00:52:24 +08:00
|
|
|
|
2014-07-10 00:31:49 +08:00
|
|
|
return &m_getopt_table.front();
|
|
|
|
}
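The zeroed final entry maintained above is how `getopt_long_only(3)` detects the end of the long-option table. A minimal standalone sketch of that convention — the `Entry` struct and `CountEntries` helper are hypothetical stand-ins for the real getopt `struct option`, not LLDB API:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for the getopt long-option table entries built above.
struct Entry {
  const char *name; // null in the sentinel entry
  int *flag;
  int val;
};

// Walk the table the way getopt_long_only does: stop at the zeroed sentinel.
static std::size_t CountEntries(const std::vector<Entry> &table) {
  std::size_t n = 0;
  while (n < table.size() && table[n].name != nullptr)
    ++n;
  return n;
}
```

Without the sentinel, the scan would run off the end of the table, which is why the last element is explicitly zeroed before the table is handed to the parser.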

// This function takes OPTION_DEF, whose usage text is to be output, and
// OUTPUT_MAX_COLUMNS, the maximum number of characters an output line may
// hold. It writes the text to STRM, on multiple lines if necessary, indented
// to the stream's current indent level. It breaks lines on spaces, tabs or
// newlines, shortening a line if necessary so as not to break in the middle
// of a word.
void Options::OutputFormattedUsageText(Stream &strm,
                                       const OptionDefinition &option_def,
                                       uint32_t output_max_columns) {
  std::string actual_text;
  if (option_def.validator) {
    const char *condition = option_def.validator->ShortConditionString();
    if (condition) {
      actual_text = "[";
      actual_text.append(condition);
      actual_text.append("] ");
    }
  }
  actual_text.append(option_def.usage_text);

  // Will it all fit on one line?

  if (static_cast<uint32_t>(actual_text.length() + strm.GetIndentLevel()) <
      output_max_columns) {
    // Output it as a single line.
    strm.Indent(actual_text);
    strm.EOL();
  } else {
    // We need to break it up into multiple lines.

    int text_width = output_max_columns - strm.GetIndentLevel() - 1;
    int start = 0;
    int end = start;
    int final_end = actual_text.length();
    int sub_len;

    while (end < final_end) {
      // Don't start the 'text' on a space, since we're already outputting the
      // indentation.
      while ((start < final_end) && (actual_text[start] == ' '))
        start++;

      end = start + text_width;
      if (end > final_end)
        end = final_end;
      else {
        // If we're not at the end of the text, make sure we break the line on
        // white space.
        while (end > start && actual_text[end] != ' ' &&
               actual_text[end] != '\t' && actual_text[end] != '\n')
          end--;
      }

      sub_len = end - start;
      if (start != 0)
        strm.EOL();
      strm.Indent();
      assert(start < final_end);
      assert(start + sub_len <= final_end);
      strm.Write(actual_text.c_str() + start, sub_len);
      start = end + 1;
    }
    strm.EOL();
  }
}
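The wrapping loop above can be isolated into a small free function. This is a sketch under the same rules (break on whitespace, never start a line on a space), with one hypothetical addition: a hard break for single words longer than the width, where the original instead emits an empty chunk and advances one character at a time.

```cpp
#include <string>
#include <vector>

// Wrap `text` into lines of at most `width` characters, breaking on
// whitespace where possible. Standalone sketch of the loop above.
std::vector<std::string> WrapText(const std::string &text, int width) {
  std::vector<std::string> lines;
  int start = 0;
  const int final_end = static_cast<int>(text.length());
  while (start < final_end) {
    // Don't start a line on a space; the indentation already handles that.
    while (start < final_end && text[start] == ' ')
      start++;
    if (start >= final_end)
      break;
    int end = start + width;
    bool broke_on_space = false;
    if (end >= final_end)
      end = final_end;
    else {
      // Back up to the last whitespace so we don't split a word.
      int ws = end;
      while (ws > start && text[ws] != ' ' && text[ws] != '\t' &&
             text[ws] != '\n')
        ws--;
      if (ws > start) { // Found whitespace to break on.
        end = ws;
        broke_on_space = true;
      } // Else: a single word longer than `width`; hard-break at `end`.
    }
    lines.push_back(text.substr(start, end - start));
    start = broke_on_space ? end + 1 : end; // Skip the break character.
  }
  return lines;
}
```

The `broke_on_space` flag matters: only a whitespace break character should be consumed when advancing `start`, otherwise a hard break would silently drop one character per line.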

bool Options::SupportsLongOption(const char *long_option) {
  if (!long_option || !long_option[0])
    return false;

  auto opt_defs = GetDefinitions();
  if (opt_defs.empty())
    return false;

  const char *long_option_name = long_option;
  if (long_option[0] == '-' && long_option[1] == '-')
    long_option_name += 2;

  for (auto &def : opt_defs) {
    if (!def.long_option)
      continue;

    if (strcmp(def.long_option, long_option_name) == 0)
      return true;
  }

  return false;
}
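A standalone sketch of the same lookup against a plain null-terminated name list — `SupportsLongOptionIn` and the sample names are hypothetical, not LLDB API: strip an optional `--` prefix, then do exact string comparisons.

```cpp
#include <cstring>

// Return true if `long_option` (with or without a leading "--") matches one
// of the entries in the null-terminated `names` array.
static bool SupportsLongOptionIn(const char *long_option,
                                 const char *const *names) {
  if (!long_option || !long_option[0])
    return false;
  const char *name = long_option;
  if (name[0] == '-' && name[1] == '-')
    name += 2; // Accept both "--verbose" and "verbose".
  for (; *names; ++names)
    if (std::strcmp(*names, name) == 0)
      return true;
  return false;
}
```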

enum OptionDisplayType {
  eDisplayBestOption,
  eDisplayShortOption,
  eDisplayLongOption
};

static bool PrintOption(const OptionDefinition &opt_def,
                        OptionDisplayType display_type, const char *header,
                        const char *footer, bool show_optional, Stream &strm) {
  if (display_type == eDisplayShortOption && !opt_def.HasShortOption())
    return false;

  if (header && header[0])
    strm.PutCString(header);

  if (show_optional && !opt_def.required)
    strm.PutChar('[');
  const bool show_short_option =
      opt_def.HasShortOption() && display_type != eDisplayLongOption;
  if (show_short_option)
    strm.Printf("-%c", opt_def.short_option);
  else
    strm.Printf("--%s", opt_def.long_option);
  switch (opt_def.option_has_arg) {
  case OptionParser::eNoArgument:
    break;
  case OptionParser::eRequiredArgument:
    strm.Printf(" <%s>", CommandObject::GetArgumentName(opt_def.argument_type));
    break;

  case OptionParser::eOptionalArgument:
    strm.Printf("%s[<%s>]", show_short_option ? "" : "=",
                CommandObject::GetArgumentName(opt_def.argument_type));
    break;
  }
  if (show_optional && !opt_def.required)
    strm.PutChar(']');
  if (footer && footer[0])
    strm.PutCString(footer);
  return true;
}

void Options::GenerateOptionUsage(Stream &strm, CommandObject &cmd,
                                  uint32_t screen_width) {
  auto opt_defs = GetDefinitions();
  const uint32_t save_indent_level = strm.GetIndentLevel();
  llvm::StringRef name = cmd.GetCommandName();
  StreamString arguments_str;
  cmd.GetFormattedCommandArguments(arguments_str);

  const uint32_t num_options = NumCommandOptions();
  if (num_options == 0)
    return;

  const bool only_print_args = cmd.IsDashDashCommand();
  if (!only_print_args)
    strm.PutCString("\nCommand Options Usage:\n");

  strm.IndentMore(2);

  // First, show each usage level set of options, e.g.
  //   <cmd> [options-for-level-0]
  //   <cmd> [options-for-level-1]
  //   etc.

  if (!only_print_args) {
    uint32_t num_option_sets = GetRequiredOptions().size();
    for (uint32_t opt_set = 0; opt_set < num_option_sets; ++opt_set) {
      if (opt_set > 0)
        strm.Printf("\n");
      strm.Indent(name);

      // Different option sets may require different args.
      StreamString args_str;
      uint32_t opt_set_mask = 1 << opt_set;
      cmd.GetFormattedCommandArguments(args_str, opt_set_mask);

      // First go through and print all options that take no arguments as a
      // single string. If a command has "-a" "-b" and "-c", this will show up
      // as [-abc]

      // We use a set here so that they will be sorted.
      std::set<int> required_options;
      std::set<int> optional_options;

      for (auto &def : opt_defs) {
        if (def.usage_mask & opt_set_mask && def.HasShortOption() &&
            def.option_has_arg == OptionParser::eNoArgument) {
          if (def.required) {
            required_options.insert(def.short_option);
          } else {
            optional_options.insert(def.short_option);
          }
        }
      }

      if (!required_options.empty()) {
        strm.PutCString(" -");
        for (int short_option : required_options)
          strm.PutChar(short_option);
      }

      if (!optional_options.empty()) {
        strm.PutCString(" [-");
        for (int short_option : optional_options)
          strm.PutChar(short_option);
        strm.PutChar(']');
      }

      // First go through and print the required options (list them up front).
      for (auto &def : opt_defs) {
        if (def.usage_mask & opt_set_mask && def.HasShortOption() &&
            def.required && def.option_has_arg != OptionParser::eNoArgument)
          PrintOption(def, eDisplayBestOption, " ", nullptr, true, strm);
      }

      // Now go through again, and this time only print the optional options.
      for (auto &def : opt_defs) {
        if (def.usage_mask & opt_set_mask && !def.required &&
            def.option_has_arg != OptionParser::eNoArgument)
          PrintOption(def, eDisplayBestOption, " ", nullptr, true, strm);
      }

      if (args_str.GetSize() > 0) {
        if (cmd.WantsRawCommandString())
          strm.Printf(" --");
        strm << " " << args_str.GetString();
      }
    }
  }

  if ((only_print_args || cmd.WantsRawCommandString()) &&
      arguments_str.GetSize() > 0) {
    if (!only_print_args)
      strm.PutChar('\n');
    strm.Indent(name);
    strm << " " << arguments_str.GetString();
  }

  if (!only_print_args) {
    strm.Printf("\n\n");

    // Now print out all the detailed information about the various options:
    // long form, short form and help text:
    //   -short <argument> ( --long_name <argument> )
    //   help text

    strm.IndentMore(5);

    // Put the command options in a sorted container, so we can output
    // them alphabetically by short_option.
    std::multimap<int, uint32_t> options_ordered;
    for (auto def : llvm::enumerate(opt_defs))
      options_ordered.insert(
          std::make_pair(def.value().short_option, def.index()));

    // Go through each option, find the table entry and write out the detailed
    // help information for that option.

    bool first_option_printed = false;

    for (auto pos : options_ordered) {
      // Put a newline separation between arguments
      if (first_option_printed)
        strm.EOL();
      else
        first_option_printed = true;

      OptionDefinition opt_def = opt_defs[pos.second];

      strm.Indent();
      if (opt_def.short_option && opt_def.HasShortOption()) {
        PrintOption(opt_def, eDisplayShortOption, nullptr, nullptr, false,
                    strm);
        PrintOption(opt_def, eDisplayLongOption, " ( ", " )", false, strm);
      } else {
        // Short option is not printable, just print long option
        PrintOption(opt_def, eDisplayLongOption, nullptr, nullptr, false, strm);
      }
      strm.EOL();

      strm.IndentMore(5);

      if (opt_def.usage_text)
        OutputFormattedUsageText(strm, opt_def, screen_width);
      if (!opt_def.enum_values.empty()) {
        strm.Indent();
        strm.Printf("Values: ");
        bool is_first = true;
        for (const auto &enum_value : opt_def.enum_values) {
          if (is_first) {
            strm.Printf("%s", enum_value.string_value);
            is_first = false;
          } else
            strm.Printf(" | %s", enum_value.string_value);
        }
        strm.EOL();
      }
      strm.IndentLess(5);
    }
  }

  // Restore the indent level
  strm.SetIndentLevel(save_indent_level);
}
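The `[-abc]` clustering done in the usage line above can be sketched in isolation. `FormatShortOptionClusters` is a hypothetical helper; `std::set` keeps the short option characters sorted, just as in the function above.

```cpp
#include <set>
#include <string>

// Render the no-argument short options of a command as " -abc [-xyz]":
// required options first as one cluster, optional ones bracketed.
static std::string
FormatShortOptionClusters(const std::set<int> &required,
                          const std::set<int> &optional_opts) {
  std::string out;
  if (!required.empty()) {
    out += " -";
    for (int c : required)
      out += static_cast<char>(c); // std::set iterates in sorted order.
  }
  if (!optional_opts.empty()) {
    out += " [-";
    for (int c : optional_opts)
      out += static_cast<char>(c);
    out += ']';
  }
  return out;
}
```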

// This function is called when we have been given a potentially incomplete set
// of options, such as when an alias has been defined (more options might be
// added at the time the alias is invoked). We need to verify that the
// options in the set m_seen_options are all part of a set that may be used
// together, but m_seen_options may be missing some of the "required" options.

bool Options::VerifyPartialOptions(CommandReturnObject &result) {
  bool options_are_valid = false;

  int num_levels = GetRequiredOptions().size();
  if (num_levels) {
    for (int i = 0; i < num_levels && !options_are_valid; ++i) {
      // In this case we are treating all options as optional rather than
      // required. Therefore a set of options is correct if m_seen_options is a
      // subset of the union of m_required_options and m_optional_options.
      OptionSet union_set;
      OptionsSetUnion(GetRequiredOptions()[i], GetOptionalOptions()[i],
                      union_set);
      if (IsASubset(m_seen_options, union_set))
        options_are_valid = true;
    }
  }

  return options_are_valid;
}
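The subset check above (`IsASubset` over the union of the required and optional sets) can be sketched with standard containers. `IsValidPartialSet` is a hypothetical stand-in built on `std::includes`, which requires sorted ranges — exactly what `std::set` iteration provides.

```cpp
#include <algorithm>
#include <set>

// A partial option set is valid for an option level if every seen option
// appears in the union of that level's required and optional options.
static bool IsValidPartialSet(const std::set<char> &seen,
                              const std::set<char> &required_set,
                              const std::set<char> &optional_set) {
  std::set<char> union_set = required_set;
  union_set.insert(optional_set.begin(), optional_set.end());
  // std::includes: is `seen` a subset of the sorted `union_set` range?
  return std::includes(union_set.begin(), union_set.end(), seen.begin(),
                       seen.end());
}
```

An empty `seen` set is always valid, which is the point of treating required options as optional while an alias is still being assembled.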
|
|
|
|
|
2018-07-14 02:28:14 +08:00
|
|
|
bool Options::HandleOptionCompletion(CompletionRequest &request,
|
|
|
|
OptionElementVector &opt_element_vector,
|
|
|
|
CommandInterpreter &interpreter) {
|
2013-08-27 07:57:52 +08:00
|
|
|
// For now we just scan the completions to see if the cursor position is in
|
|
|
|
// an option or its argument. Otherwise we'll call HandleArgumentCompletion.
|
2018-05-01 00:49:04 +08:00
|
|
|
// In the future we can use completion to validate options as well if we
|
|
|
|
// want.
|
2016-09-07 04:57:50 +08:00
|
|
|
|
Convert option tables to ArrayRefs.
This change is very mechanical. All it does is change the
signature of `Options::GetDefinitions()` and `OptionGroup::
GetDefinitions()` to return an `ArrayRef<OptionDefinition>`
instead of a `const OptionDefinition *`. In the case of the
former, it deletes the sentinel entry from every table, and
in the case of the latter, it removes the `GetNumDefinitions()`
method from the interface. These are no longer necessary as
`ArrayRef` carries its own length.
In the former case, iteration was done by using a sentinel
entry, so there was no knowledge of length. Because of this
the individual option tables were allowed to be defined below
the corresponding class (after all, only a pointer was needed).
Now, however, the length must be known at compile time to
construct the `ArrayRef`, and as a result it is necessary to
move every option table before its corresponding class. This
results in this CL looking very big, but in terms of substance
there is not much here.
Differential revision: https://reviews.llvm.org/D24834
llvm-svn: 282188
2016-09-23 04:22:55 +08:00
|
|
|
auto opt_defs = GetDefinitions();
|
2016-09-07 04:57:50 +08:00
|
|
|
|
2019-08-28 17:32:30 +08:00
|
|
|
llvm::StringRef cur_opt_str = request.GetCursorArgumentPrefix();
|
2016-09-07 04:57:50 +08:00
|
|
|
|
2013-08-27 07:57:52 +08:00
|
|
|
for (size_t i = 0; i < opt_element_vector.size(); i++) {
|
2019-09-23 17:46:17 +08:00
|
|
|
size_t opt_pos = static_cast<size_t>(opt_element_vector[i].opt_pos);
|
|
|
|
size_t opt_arg_pos = static_cast<size_t>(opt_element_vector[i].opt_arg_pos);
|
2010-06-09 00:52:24 +08:00
|
|
|
int opt_defs_index = opt_element_vector[i].opt_defs_index;
|
2018-07-14 02:28:14 +08:00
|
|
|
if (opt_pos == request.GetCursorIndex()) {
|
2010-06-09 00:52:24 +08:00
|
|
|
// We're completing the option itself.
|
2016-09-07 04:57:50 +08:00
|
|
|
|
2010-06-25 04:30:15 +08:00
|
|
|
if (opt_defs_index == OptionArgElement::eBareDash) {
|
|
|
|
// We're completing a bare dash. That means all options are open.
|
|
|
|
// FIXME: We should scan the other options provided and only complete
|
2010-06-09 00:52:24 +08:00
|
|
|
// options
|
2010-06-25 04:30:15 +08:00
|
|
|
// within the option group they belong to.
|
2019-08-28 17:32:30 +08:00
|
|
|
std::string opt_str = "-a";
|
2016-09-07 04:57:50 +08:00
|
|
|
|
Convert option tables to ArrayRefs.
This change is very mechanical. All it does is change the
signature of `Options::GetDefinitions()` and `OptionGroup::
GetDefinitions()` to return an `ArrayRef<OptionDefinition>`
instead of a `const OptionDefinition *`. In the case of the
former, it deletes the sentinel entry from every table, and
in the case of the latter, it removes the `GetNumDefinitions()`
method from the interface. These are no longer necessary as
`ArrayRef` carries its own length.
In the former case, iteration was done by using a sentinel
entry, so there was no knowledge of length. Because of this
the individual option tables were allowed to be defined below
the corresponding class (after all, only a pointer was needed).
Now, however, the length must be known at compile time to
construct the `ArrayRef`, and as a result it is necessary to
move every option table before its corresponding class. This
results in this CL looking very big, but in terms of substance
there is not much here.
Differential revision: https://reviews.llvm.org/D24834
llvm-svn: 282188
2016-09-23 04:22:55 +08:00
|
|
|
for (auto &def : opt_defs) {
|
|
|
|
if (!def.short_option)
|
|
|
|
continue;
|
|
|
|
opt_str[1] = def.short_option;
|
2019-09-02 16:34:57 +08:00
|
|
|
request.AddCompletion(opt_str, def.usage_text);
|
2016-09-07 04:57:50 +08:00
|
|
|
}
|
Convert option tables to ArrayRefs.
This change is very mechanical. All it does is change the
signature of `Options::GetDefinitions()` and `OptionGroup::
GetDefinitions()` to return an `ArrayRef<OptionDefinition>`
instead of a `const OptionDefinition *`. In the case of the
former, it deletes the sentinel entry from every table, and
in the case of the latter, it removes the `GetNumDefinitions()`
method from the interface. These are no longer necessary as
`ArrayRef` carries its own length.
In the former case, iteration was done by using a sentinel
entry, so there was no knowledge of length. Because of this
the individual option tables were allowed to be defined below
the corresponding class (after all, only a pointer was needed).
Now, however, the length must be known at compile time to
construct the `ArrayRef`, and as a result it is necessary to
move every option table before its corresponding class. This
results in this CL looking very big, but in terms of substance
there is not much here.
Differential revision: https://reviews.llvm.org/D24834
llvm-svn: 282188
2016-09-23 04:22:55 +08:00
        return true;
      } else if (opt_defs_index == OptionArgElement::eBareDoubleDash) {
        std::string full_name("--");
        for (auto &def : opt_defs) {
          if (!def.short_option)
            continue;

          full_name.erase(full_name.begin() + 2, full_name.end());
          full_name.append(def.long_option);
          request.AddCompletion(full_name, def.usage_text);
        }
        return true;
      } else if (opt_defs_index != OptionArgElement::eUnrecognizedArg) {
        // We recognized it; if it is an incomplete long option, complete it
        // anyway (getopt_long_only is happy with the shortest unique string,
        // but it's still a nice thing to do.) Otherwise return the string so
        // the upper level code will know this is a full match and add the " ".
        const OptionDefinition &opt = opt_defs[opt_defs_index];
        llvm::StringRef long_option = opt.long_option;
        if (cur_opt_str.startswith("--") && cur_opt_str != long_option) {
          request.AddCompletion("--" + long_option.str(), opt.usage_text);
          return true;
        } else
          request.AddCompletion(request.GetCursorArgumentPrefix());
        return true;
      } else {
        // FIXME - not handling wrong options yet:
        // Check to see if they are writing a long option & complete it.
        // I think we will only get in here if the long option table has two
        // elements that are not unique up to this point. getopt_long_only
        // does the shortest unique match for long options already.
        if (cur_opt_str.consume_front("--")) {
          for (auto &def : opt_defs) {
            llvm::StringRef long_option(def.long_option);
            if (long_option.startswith(cur_opt_str))
              request.AddCompletion("--" + long_option.str(), def.usage_text);
          }
        }
        return true;
      }

    } else if (opt_arg_pos == request.GetCursorIndex()) {
      // Okay the cursor is on the completion of an argument. See if it has a
      // completion, otherwise return no matches.
      if (opt_defs_index != -1) {
        HandleOptionArgumentCompletion(request, opt_element_vector, i,
                                       interpreter);
        return true;
      } else {
        // No completion callback means no completions...
        return true;
      }

    } else {
      // Not the last element, keep going.
      continue;
    }
  }
  return false;
}
void Options::HandleOptionArgumentCompletion(
    CompletionRequest &request, OptionElementVector &opt_element_vector,
    int opt_element_index, CommandInterpreter &interpreter) {
  auto opt_defs = GetDefinitions();
  std::unique_ptr<SearchFilter> filter_up;

  int opt_defs_index = opt_element_vector[opt_element_index].opt_defs_index;

  // See if this is an enumeration type option, and if so complete it here:
  const auto &enum_values = opt_defs[opt_defs_index].enum_values;
  if (!enum_values.empty())
    for (const auto &enum_value : enum_values)
      request.TryCompleteCurrentArg(enum_value.string_value);

  // If this is a source file or symbol type completion, and there is a
  // -shlib option somewhere in the supplied arguments, then make a search
  // filter for that shared library.
  // FIXME: Do we want to also have an "OptionType" so we don't have to match
  // string names?

  uint32_t completion_mask = opt_defs[opt_defs_index].completion_type;

  if (completion_mask == 0) {
    lldb::CommandArgumentType option_arg_type =
        opt_defs[opt_defs_index].argument_type;
    if (option_arg_type != eArgTypeNone) {
      const CommandObject::ArgumentTableEntry *arg_entry =
          CommandObject::FindArgumentDataByType(
              opt_defs[opt_defs_index].argument_type);
      if (arg_entry)
        completion_mask = arg_entry->completion_type;
    }
  }

  if (completion_mask & CommandCompletions::eSourceFileCompletion ||
      completion_mask & CommandCompletions::eSymbolCompletion) {
    for (size_t i = 0; i < opt_element_vector.size(); i++) {
      int cur_defs_index = opt_element_vector[i].opt_defs_index;

      // Trying to use <0 indices will definitely cause problems.
      if (cur_defs_index == OptionArgElement::eUnrecognizedArg ||
          cur_defs_index == OptionArgElement::eBareDash ||
          cur_defs_index == OptionArgElement::eBareDoubleDash)
        continue;

      int cur_arg_pos = opt_element_vector[i].opt_arg_pos;
      const char *cur_opt_name = opt_defs[cur_defs_index].long_option;

      // If this is the "shlib" option and there was an argument provided,
      // restrict it to that shared library.
      if (cur_opt_name && strcmp(cur_opt_name, "shlib") == 0 &&
          cur_arg_pos != -1) {
        const char *module_name =
            request.GetParsedLine().GetArgumentAtIndex(cur_arg_pos);
        if (module_name) {
          FileSpec module_spec(module_name);
          lldb::TargetSP target_sp =
              interpreter.GetDebugger().GetSelectedTarget();
          // Search filters require a target...
          if (target_sp)
            filter_up =
                std::make_unique<SearchFilterByModule>(target_sp, module_spec);
        }
        break;
      }
    }
  }

  CommandCompletions::InvokeCommonCompletionCallbacks(
      interpreter, completion_mask, request, filter_up.get());
}
void OptionGroupOptions::Append(OptionGroup *group) {
  auto group_option_defs = group->GetDefinitions();
  for (uint32_t i = 0; i < group_option_defs.size(); ++i) {
    m_option_infos.push_back(OptionInfo(group, i));
    m_option_defs.push_back(group_option_defs[i]);
  }
}
const OptionGroup *OptionGroupOptions::GetGroupWithOption(char short_opt) {
  for (uint32_t i = 0; i < m_option_defs.size(); i++) {
    OptionDefinition opt_def = m_option_defs[i];
    if (opt_def.short_option == short_opt)
      return m_option_infos[i].option_group;
  }
  return nullptr;
}
void OptionGroupOptions::Append(OptionGroup *group, uint32_t src_mask,
                                uint32_t dst_mask) {
  auto group_option_defs = group->GetDefinitions();
  for (uint32_t i = 0; i < group_option_defs.size(); ++i) {
    if (group_option_defs[i].usage_mask & src_mask) {
      m_option_infos.push_back(OptionInfo(group, i));
      m_option_defs.push_back(group_option_defs[i]);
      m_option_defs.back().usage_mask = dst_mask;
    }
  }
}
void OptionGroupOptions::Finalize() {
  m_did_finalize = true;
}
Status OptionGroupOptions::SetOptionValue(uint32_t option_idx,
                                          llvm::StringRef option_value,
                                          ExecutionContext *execution_context) {
  // After calling OptionGroupOptions::Append(...), you must finalize the
  // groups by calling OptionGroupOptions::Finalize().
  assert(m_did_finalize);
  Status error;
  if (option_idx < m_option_infos.size()) {
    error = m_option_infos[option_idx].option_group->SetOptionValue(
        m_option_infos[option_idx].option_index, option_value,
        execution_context);
  } else {
    error.SetErrorString("invalid option index"); // Shouldn't happen...
  }
  return error;
}
void OptionGroupOptions::OptionParsingStarting(
    ExecutionContext *execution_context) {
  std::set<OptionGroup *> group_set;
  OptionInfos::iterator pos, end = m_option_infos.end();
  for (pos = m_option_infos.begin(); pos != end; ++pos) {
    OptionGroup *group = pos->option_group;
    if (group_set.find(group) == group_set.end()) {
      group->OptionParsingStarting(execution_context);
      group_set.insert(group);
    }
  }
}
Status
OptionGroupOptions::OptionParsingFinished(ExecutionContext *execution_context) {
  std::set<OptionGroup *> group_set;
  Status error;
  OptionInfos::iterator pos, end = m_option_infos.end();
  for (pos = m_option_infos.begin(); pos != end; ++pos) {
    OptionGroup *group = pos->option_group;
    if (group_set.find(group) == group_set.end()) {
      error = group->OptionParsingFinished(execution_context);
      group_set.insert(group);
      if (error.Fail())
        return error;
    }
  }
  return error;
}
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
// OptionParser permutes the arguments while processing them, so we create a
// temporary array to avoid modification of the input arguments. The options
// themselves are never modified, but the API expects a char * anyway, hence
// the const_cast.
static std::vector<char *> GetArgvForParsing(const Args &args) {
  std::vector<char *> result;
  // OptionParser always skips the first argument as it is based on getopt().
  result.push_back(const_cast<char *>("<FAKE-ARG0>"));
  for (const Args::ArgEntry &entry : args)
    result.push_back(const_cast<char *>(entry.c_str()));
  result.push_back(nullptr);
  return result;
}
// Given a permuted argument, find its position in the original Args vector.
static Args::const_iterator FindOriginalIter(const char *arg,
                                             const Args &original) {
  return llvm::find_if(
      original, [arg](const Args::ArgEntry &D) { return D.c_str() == arg; });
}
// Given a permuted argument, find its index in the original Args vector.
static size_t FindOriginalIndex(const char *arg, const Args &original) {
  return std::distance(original.begin(), FindOriginalIter(arg, original));
}
// Construct a new Args object, consisting of the entries from the original
// arguments, but in the permuted order.
static Args ReconstituteArgsAfterParsing(llvm::ArrayRef<char *> parsed,
                                         const Args &original) {
  Args result;
  for (const char *arg : parsed) {
    auto pos = FindOriginalIter(arg, original);
    assert(pos != original.end());
    result.AppendArgument(pos->ref(), pos->GetQuoteChar());
  }
  return result;
}
static size_t FindArgumentIndexForOption(const Args &args,
                                         const Option &long_option) {
  std::string short_opt = llvm::formatv("-{0}", char(long_option.val)).str();
  std::string long_opt =
      std::string(llvm::formatv("--{0}", long_option.definition->long_option));
  for (const auto &entry : llvm::enumerate(args)) {
    if (entry.value().ref().startswith(short_opt) ||
        entry.value().ref().startswith(long_opt))
      return entry.index();
  }

  return size_t(-1);
}
|
|
|
|
|
2019-07-11 01:09:47 +08:00
|
|
|
static std::string BuildShortOptions(const Option *long_options) {
|
|
|
|
std::string storage;
|
|
|
|
llvm::raw_string_ostream sstr(storage);
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
|
2019-07-11 01:09:47 +08:00
|
|
|
// Leading : tells getopt to return a : for a missing option argument AND to
|
|
|
|
// suppress error messages.
|
|
|
|
sstr << ":";
  for (size_t i = 0; long_options[i].definition != nullptr; ++i) {
    if (long_options[i].flag == nullptr) {
      sstr << (char)long_options[i].val;
      switch (long_options[i].definition->option_has_arg) {
      default:
      case OptionParser::eNoArgument:
        break;
      case OptionParser::eRequiredArgument:
        sstr << ":";
        break;
      case OptionParser::eOptionalArgument:
        sstr << "::";
        break;
      }
    }
  }
  return sstr.str();
}

llvm::Expected<Args> Options::ParseAlias(const Args &args,
                                         OptionArgVector *option_arg_vector,
                                         std::string &input_line) {
  Option *long_options = GetLongOptions();

  if (long_options == nullptr) {
    return llvm::make_error<llvm::StringError>("Invalid long options",
                                               llvm::inconvertibleErrorCode());
  }

  std::string short_options = BuildShortOptions(long_options);
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
|
|
|
|
Args args_copy = args;
|
|
|
|
std::vector<char *> argv = GetArgvForParsing(args);
|
|
|
|
|
|
|
|
std::unique_lock<std::mutex> lock;
|
|
|
|
OptionParser::Prepare(lock);
|
|
|
|
int val;
|
2019-05-24 08:44:33 +08:00
|
|
|
while (true) {
    int long_options_index = -1;
    val = OptionParser::Parse(argv, short_options, long_options,
                              &long_options_index);

    if (val == ':') {
      return llvm::createStringError(llvm::inconvertibleErrorCode(),
                                     "last option requires an argument");
    }
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
|
|
|
|
if (val == -1)
|
|
|
|
break;

    if (val == '?') {
      return llvm::make_error<llvm::StringError>(
          "Unknown or ambiguous option", llvm::inconvertibleErrorCode());
    }

    if (val == 0)
      continue;

    OptionSeen(val);

    // Look up the long option index
    if (long_options_index == -1) {
      for (int j = 0; long_options[j].definition || long_options[j].flag ||
                          long_options[j].val;
           ++j) {
        if (long_options[j].val == val) {
          long_options_index = j;
          break;
        }
      }
    }

    // See if the option takes an argument, and see if one was supplied.
    if (long_options_index == -1) {
      return llvm::make_error<llvm::StringError>(
          llvm::formatv("Invalid option with value '{0}'.", char(val)).str(),
          llvm::inconvertibleErrorCode());
    }

    StreamString option_str;
    option_str.Printf("-%c", val);
    const OptionDefinition *def = long_options[long_options_index].definition;
    int has_arg =
        (def == nullptr) ? OptionParser::eNoArgument : def->option_has_arg;

    const char *option_arg = nullptr;
    switch (has_arg) {
    case OptionParser::eRequiredArgument:
      if (OptionParser::GetOptionArgument() == nullptr) {
        return llvm::make_error<llvm::StringError>(
            llvm::formatv("Option '{0}' is missing argument specifier.",
                          option_str.GetString())
                .str(),
            llvm::inconvertibleErrorCode());
      }
      LLVM_FALLTHROUGH;
    case OptionParser::eOptionalArgument:
      option_arg = OptionParser::GetOptionArgument();
      LLVM_FALLTHROUGH;
    case OptionParser::eNoArgument:
      break;
    default:
      return llvm::make_error<llvm::StringError>(
          llvm::formatv("error with options table; invalid value in has_arg "
                        "field for option '{0}'.",
                        char(val))
              .str(),
          llvm::inconvertibleErrorCode());
    }
    if (!option_arg)
      option_arg = "<no-argument>";
|
2020-02-06 04:29:59 +08:00
|
|
|
option_arg_vector->emplace_back(std::string(option_str.GetString()),
|
|
|
|
has_arg, std::string(option_arg));
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
|
2018-05-01 00:49:04 +08:00
|
|
|
// Find option in the argument list; also see if it was supposed to take an
|
|
|
|
// argument and if one was supplied. Remove option (and argument, if
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
// given) from the argument list. Also remove them from the
|
|
|
|
// raw_input_string, if one was passed in.
|
|
|
|
    size_t idx =
        FindArgumentIndexForOption(args_copy, long_options[long_options_index]);
    if (idx == size_t(-1))
      continue;

    if (!input_line.empty()) {
      auto tmp_arg = args_copy[idx].ref();
      size_t pos = input_line.find(std::string(tmp_arg));
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
if (pos != std::string::npos)
|
|
|
|
input_line.erase(pos, tmp_arg.size());
|
|
|
|
}
|
|
|
|
args_copy.DeleteArgumentAtIndex(idx);
    if ((long_options[long_options_index].definition->option_has_arg !=
         OptionParser::eNoArgument) &&
        (OptionParser::GetOptionArgument() != nullptr) &&
        (idx < args_copy.GetArgumentCount()) &&
        (args_copy[idx].ref() == OptionParser::GetOptionArgument())) {
      if (input_line.size() > 0) {
        auto tmp_arg = args_copy[idx].ref();
        size_t pos = input_line.find(std::string(tmp_arg));
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
if (pos != std::string::npos)
|
|
|
|
input_line.erase(pos, tmp_arg.size());
|
|
|
|
}
|
|
|
|
args_copy.DeleteArgumentAtIndex(idx);
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
return std::move(args_copy);
|
|
|
|
}

OptionElementVector Options::ParseForCompletion(const Args &args,
                                                uint32_t cursor_index) {
  OptionElementVector option_element_vector;
  Option *long_options = GetLongOptions();
  option_element_vector.clear();

  if (long_options == nullptr)
    return option_element_vector;

  std::string short_options = BuildShortOptions(long_options);

  std::unique_lock<std::mutex> lock;
  OptionParser::Prepare(lock);
  OptionParser::EnableError(false);

  int val;
  auto opt_defs = GetDefinitions();

  std::vector<char *> dummy_vec = GetArgvForParsing(args);

  bool failed_once = false;
  uint32_t dash_dash_pos = -1;

  while (true) {
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
bool missing_argument = false;
|
|
|
|
int long_options_index = -1;
|
|
|
|
|
2019-07-11 01:09:47 +08:00
|
|
|
val = OptionParser::Parse(dummy_vec, short_options, long_options,
|
|
|
|
&long_options_index);

    if (val == -1) {
      // When we're completing a "--" which is the last option on line,
      if (failed_once)
        break;

      failed_once = true;

      // If this is a bare "--" we mark it as such so we can complete it
      // successfully later. Handling the "--" is a little tricky, since that
      // may mean end of options or arguments, or the user might want to
      // complete options by long name. I make this work by checking whether
      // the cursor is in the "--" argument, and if so I assume we're
      // completing the long option, otherwise I let it pass to
      // OptionParser::Parse which will terminate the option parsing. Note, in
      // either case we continue parsing the line so we can figure out what
      // other options were passed. This will be useful when we come to
      // restricting completions based on what other options we've seen on the
      // line.

      if (static_cast<size_t>(OptionParser::GetOptionIndex()) <
              dummy_vec.size() &&
          (strcmp(dummy_vec[OptionParser::GetOptionIndex() - 1], "--") == 0)) {
        dash_dash_pos = FindOriginalIndex(
            dummy_vec[OptionParser::GetOptionIndex() - 1], args);
        if (dash_dash_pos == cursor_index) {
          option_element_vector.push_back(
              OptionArgElement(OptionArgElement::eBareDoubleDash, dash_dash_pos,
                               OptionArgElement::eBareDoubleDash));
          continue;
        } else
          break;
      } else
        break;
    } else if (val == '?') {
      option_element_vector.push_back(OptionArgElement(
          OptionArgElement::eUnrecognizedArg,
          FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
                            args),
          OptionArgElement::eUnrecognizedArg));
      continue;
    } else if (val == 0) {
      continue;
    } else if (val == ':') {
      // This is a missing argument.
      val = OptionParser::GetOptionErrorCause();
      missing_argument = true;
    }
|
|
|
|
|
|
|
|
OptionSeen(val);
|
|
|
|
|
|
|
|
// Look up the long option index
|
|
|
|
if (long_options_index == -1) {
|
|
|
|
for (int j = 0; long_options[j].definition || long_options[j].flag ||
|
|
|
|
long_options[j].val;
|
|
|
|
++j) {
|
|
|
|
if (long_options[j].val == val) {
|
|
|
|
long_options_index = j;
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
// See if the option takes an argument, and see if one was supplied.
|
|
|
|
if (long_options_index >= 0) {
|
|
|
|
int opt_defs_index = -1;
|
|
|
|
for (size_t i = 0; i < opt_defs.size(); i++) {
|
|
|
|
if (opt_defs[i].short_option != val)
|
|
|
|
continue;
|
|
|
|
opt_defs_index = i;
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
|
|
|
|
const OptionDefinition *def = long_options[long_options_index].definition;
|
|
|
|
int has_arg =
|
|
|
|
(def == nullptr) ? OptionParser::eNoArgument : def->option_has_arg;
|
|
|
|
switch (has_arg) {
|
|
|
|
case OptionParser::eNoArgument:
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
opt_defs_index,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
|
|
|
|
args),
|
|
|
|
0));
|
|
|
|
break;
|
|
|
|
case OptionParser::eRequiredArgument:
|
|
|
|
if (OptionParser::GetOptionArgument() != nullptr) {
|
|
|
|
int arg_index;
|
|
|
|
if (missing_argument)
|
|
|
|
arg_index = -1;
|
|
|
|
else
|
|
|
|
arg_index = OptionParser::GetOptionIndex() - 2;
|
|
|
|
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
opt_defs_index,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 2],
|
|
|
|
args),
|
|
|
|
arg_index));
|
|
|
|
} else {
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
opt_defs_index,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
|
|
|
|
args),
|
|
|
|
-1));
|
|
|
|
}
|
|
|
|
break;
|
|
|
|
case OptionParser::eOptionalArgument:
|
|
|
|
if (OptionParser::GetOptionArgument() != nullptr) {
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
opt_defs_index,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 2],
|
|
|
|
args),
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
|
|
|
|
args)));
|
|
|
|
} else {
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
opt_defs_index,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 2],
|
|
|
|
args),
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
|
|
|
|
args)));
|
|
|
|
}
|
|
|
|
break;
|
|
|
|
default:
|
|
|
|
// The options table is messed up. Here we'll just continue
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
OptionArgElement::eUnrecognizedArg,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
|
|
|
|
args),
|
|
|
|
OptionArgElement::eUnrecognizedArg));
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
} else {
|
|
|
|
option_element_vector.push_back(OptionArgElement(
|
|
|
|
OptionArgElement::eUnrecognizedArg,
|
|
|
|
FindOriginalIndex(dummy_vec[OptionParser::GetOptionIndex() - 1],
|
|
|
|
args),
|
|
|
|
OptionArgElement::eUnrecognizedArg));
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
// Finally we have to handle the case where the cursor index points at a
|
2018-05-01 00:49:04 +08:00
|
|
|
// single "-". We want to mark that in the option_element_vector, but only
|
|
|
|
// if it is not after the "--". But it turns out that OptionParser::Parse
|
|
|
|
// just ignores an isolated "-". So we have to look it up by hand here. We
|
|
|
|
// only care if it is AT the cursor position. Note, a single quoted dash is
|
|
|
|
// not the same as a single dash...
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
|
|
|
|
const Args::ArgEntry &cursor = args[cursor_index];
|
|
|
|
if ((static_cast<int32_t>(dash_dash_pos) == -1 ||
|
|
|
|
cursor_index < dash_dash_pos) &&
|
2019-09-13 19:26:48 +08:00
|
|
|
!cursor.IsQuoted() && cursor.ref() == "-") {
|
Move option parsing out of the Args class
Summary:
The args class is used in plenty of places (a lot of them in the lower lldb
layers) for representing a list of arguments, and most of these places don't
care about option parsing. Moving the option parsing out of the class removes
the largest external dependency (there are a couple more, but these are in
static functions), and brings us closer to being able to move it to the
Utility module).
The new home for these functions is the Options class, which was already used
as an argument to the parse calls, so this just inverts the dependency between
the two.
The functions are themselves are mainly just copied -- the biggest functional
change I've made to them is to avoid modifying the input Args argument (getopt
likes to permute the argument vector), as it was weird to have another class
reorder the entries in Args class. So now the functions don't modify the input
arguments, and (for those where it makes sense) return a new Args vector
instead. I've also made the addition of a "fake arg0" (required for getopt
compatibility) an implementation detail rather than a part of interface.
While doing that I noticed that ParseForCompletion function was recording the
option indexes in the shuffled vector, but then the consumer was looking up the
entries in the unshuffled one. This manifested itself as us not being able to
complete "watchpoint set variable foo --" (because getopt would move "foo" to
the end). Surprisingly all other completions (e.g. "watchpoint set variable foo
--w") were not affected by this. However, I couldn't find a comprehensive test
for command argument completion, so I consolidated the existing tests and added
a bunch of new ones.
Reviewers: davide, jingham, zturner
Subscribers: lldb-commits
Differential Revision: https://reviews.llvm.org/D43837
llvm-svn: 327110
2018-03-09 18:39:40 +08:00
|
|
|
option_element_vector.push_back(
|
|
|
|
OptionArgElement(OptionArgElement::eBareDash, cursor_index,
|
|
|
|
OptionArgElement::eBareDash));
|
|
|
|
}
|
|
|
|
return option_element_vector;
|
|
|
|
}
|
|
|
|
|
|
|
|
llvm::Expected<Args> Options::Parse(const Args &args,
                                    ExecutionContext *execution_context,
                                    lldb::PlatformSP platform_sp,
                                    bool require_validation) {
  Status error;
  Option *long_options = GetLongOptions();
  if (long_options == nullptr) {
    return llvm::make_error<llvm::StringError>("Invalid long options.",
                                               llvm::inconvertibleErrorCode());
  }

  std::string short_options = BuildShortOptions(long_options);
  std::vector<char *> argv = GetArgvForParsing(args);
  std::unique_lock<std::mutex> lock;
  OptionParser::Prepare(lock);
  int val;
  while (true) {
    int long_options_index = -1;
    val = OptionParser::Parse(argv, short_options, long_options,
                              &long_options_index);

    if (val == ':') {
      error.SetErrorString("last option requires an argument");
      break;
    }

    if (val == -1)
      break;

    // Did we get an error?
    if (val == '?') {
      error.SetErrorString("unknown or ambiguous option");
      break;
    }

    // The option auto-set itself
    if (val == 0)
      continue;

    OptionSeen(val);

    // Lookup the long option index
    if (long_options_index == -1) {
      for (int i = 0; long_options[i].definition || long_options[i].flag ||
                      long_options[i].val;
           ++i) {
        if (long_options[i].val == val) {
          long_options_index = i;
          break;
        }
      }
    }
    // Call the callback with the option
    if (long_options_index >= 0 &&
        long_options[long_options_index].definition) {
      const OptionDefinition *def = long_options[long_options_index].definition;

      if (!platform_sp) {
        // User did not pass in an explicit platform. Try to grab from the
        // execution context.
        TargetSP target_sp =
            execution_context ? execution_context->GetTargetSP() : TargetSP();
        platform_sp = target_sp ? target_sp->GetPlatform() : PlatformSP();
      }
      OptionValidator *validator = def->validator;

      if (!platform_sp && require_validation) {
        // Caller requires validation but we cannot validate as we don't have
        // the mandatory platform against which to validate.
        return llvm::make_error<llvm::StringError>(
            "cannot validate options: no platform available",
            llvm::inconvertibleErrorCode());
      }

      bool validation_failed = false;
      if (platform_sp) {
        // Ensure we have an execution context, empty or not.
        ExecutionContext dummy_context;
        ExecutionContext *exe_ctx_p =
            execution_context ? execution_context : &dummy_context;
        if (validator && !validator->IsValid(*platform_sp, *exe_ctx_p)) {
          validation_failed = true;
          error.SetErrorStringWithFormat("Option \"%s\" invalid. %s",
                                         def->long_option,
                                         def->validator->LongConditionString());
        }
      }

      // As long as validation didn't fail, we set the option value.
      if (!validation_failed)
        error =
            SetOptionValue(long_options_index,
                           (def->option_has_arg == OptionParser::eNoArgument)
                               ? nullptr
                               : OptionParser::GetOptionArgument(),
                           execution_context);
      // If the Option setting returned an error, we should stop parsing
      // and return the error.
      if (error.Fail())
        break;
    } else {
      error.SetErrorStringWithFormat("invalid option with value '%i'", val);
    }
  }

  if (error.Fail())
    return error.ToError();

  argv.pop_back();
  argv.erase(argv.begin(), argv.begin() + OptionParser::GetOptionIndex());
  return ReconstituteArgsAfterParsing(argv, args);
}