PyObject_CallFunction returns a PyObject which needs to be
decref'ed when it is no longer needed.
Patch by David Luyer
Differential Revision: https://reviews.llvm.org/D33740
llvm-svn: 305873
This:
a) teaches PythonCallable to look inside a callable object
b) teaches PythonCallable to discover whether a callable method is bound
c) teaches lldb.command to dispatch to either the older 4-argument version or the newer 5-argument version
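For reference, a hedged sketch of the two call shapes the decorator now distinguishes between (the function names and bodies are illustrative; the parameter order follows the usual command-function conventions):

    # older 4-argument form
    def cmd_no_ctx(debugger, command, result, internal_dict):
        result.AppendMessage("no execution context parameter here")

    # newer 5-argument form; exe_ctx carries the execution context of the command
    def cmd_with_ctx(debugger, command, exe_ctx, result, internal_dict):
        result.AppendMessage(str(exe_ctx.GetFrame()))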
llvm-svn: 273640
This finishes the effort to port python-wrapper.swig code over to
using PythonDataObjects.
Also included in this patch is the removal of `PyCallable` from
`python-wrapper.swig`, as it is no longer used after having been
replaced by `PythonCallable` everywhere.
There might be additional cleanup in follow-up patches, but it should
all be fairly simple and minor.
llvm-svn: 252939
PyCallable is a class that exists solely within the swig wrapper
code. PythonCallable is a more generic implementation of the same
idea that can be used by any Python-related interop code, and lives
in PythonDataObjects.h
The CL is mostly mechanical, and it doesn't cover every possible
user of PyCallable, because I want to minimize the impact of this
change (as well as making it easier to figure out what went wrong
in case this causes a failure). I plan to finish up the rest of
the changes in a subsequent patch, culminating in the removal of
PyCallable entirely.
llvm-svn: 252906
This had been relegated to a simple forwarding function, so just
delete it in preparation for migrating all of these functions out
of python-wrapper.swig.
llvm-svn: 252803
This only begins to port python-wrapper.swig over. Since this
code can be pretty hairy, I plan to do this incrementally over a
series of patches, each time removing or converting more code
over to the PythonDataObjects code.
llvm-svn: 252788
Relying on manual Python C API calls is error prone, especially
when trying to maintain compatibility with Python 2 and Python 3.
This patch additionally fixes what appears to be a potentially
serious memory leak, in that we were incref'ing two values
returned from the session dictionary but never decref'ing them.
There was a comment indicating that it was intentional, but the
reasoning was, I believe, faulty and it resulted in a legitimate
memory leak.
Switching everything to PythonObject-based classes solves both
the compatibility issues and the resource leak issues.
llvm-svn: 252536
Summary:
Along with this, support for an optional argument to the "num_children"
method of a Python synthetic child provider has also been added. These have
been added with the following use case in mind:
Synthetic child providers currently have the methods "has_children" and
"num_children". While the former is good enough to know if there are
children, it does not give any insight into how many children there are.
Though the latter serves this purpose, calculating the number of children
of a data structure could be an O(N) operation if the data structure has N
children. The new method added in this change provides a middle ground.
One can call GetNumChildren(K) to know if a child exists at an index K,
where K can be as large as the caller's tolerance allows. If the caller
wants to know about children beyond K, it can make another call with 2K.
If the synthetic child provider maintains state from having counted up to
K previously, then the next call is only an O(K) operation. In fact, all
calls made progressively with steps of K will be O(K) operations.
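As a hedged sketch of how a provider might take advantage of the optional
argument (the class name, member names, and parameter name are illustrative
assumptions, not part of this change):

    class ListChildrenProvider:
        def __init__(self, valobj, internal_dict):
            self.valobj = valobj
        def num_children(self, max_count):
            # walk the list, but never count past max_count; a caller that needs
            # more can come back with a larger bound (e.g. 2 * max_count)
            count = 0
            node = self.valobj.GetChildMemberWithName('head')   # hypothetical member
            while node.GetValueAsUnsigned() != 0 and count < max_count:
                node = node.GetChildMemberWithName('next')      # hypothetical member
                count += 1
            return count
        def has_children(self):
            return True
        def get_child_index(self, name):
            return -1
        def get_child_at_index(self, index):
            return None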
Reviewers: vharron, clayborg, granata.enrico
Subscribers: labath, lldb-commits
Differential Revision: http://reviews.llvm.org/D13778
llvm-svn: 250930
The order of libraries passed to the linker didn't work under Linux (you
need the llvm libraries first, then the lldb libraries). I modelled this
after clang's setup here. Seemed simple enough to just be consistent.
llvm-svn: 232461
Summary:
Also, change its return type to size_t to match the return types of
its callers.
With this change, std::vector and std::list data formatter tests
pass on Linux (when using libstdc++) with clang as well as with gcc.
These tests have also been enabled in this patch.
Test Plan: dotest.py -p <TestDataFormatterStdVector|TestDataFormatterStdList>
Reviewers: vharron, clayborg
Reviewed By: clayborg
Subscribers: zturner, lldb-commits
Differential Revision: http://reviews.llvm.org/D8337
llvm-svn: 232399
This works by creating a command backed by a class whose interface should - at least - include
def __init__(self, debugger, session_dict)
def __call__(self, args, return_obj, exe_ctx)
What works:
- adding a command via command script add --class
- calling a thusly created command
What is missing:
- support for custom help
- test cases
The missing parts will follow over the next couple of days
This is an improvement over the existing system as:
a) it provides an obvious location for commands to provide help strings (i.e. methods)
b) it allows commands to store state in an obvious fashion
c) it allows us to easily add features to script commands over time (option parsing and subcommands registration, I am looking at you :-)
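A hedged sketch of a minimal class following the interface above (the class,
its output, and the assumption that return_obj behaves like an
SBCommandReturnObject and exe_ctx like an SBExecutionContext are mine):

    class FrameCountCommand:
        def __init__(self, debugger, session_dict):
            # an obvious place to set up per-command state and, later, help text
            self.debugger = debugger
        def __call__(self, args, return_obj, exe_ctx):
            thread = exe_ctx.GetThread()
            return_obj.AppendMessage("%d frames" % thread.GetNumFrames())

    # (lldb) command script add --class mymodule.FrameCountCommand frame-count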
llvm-svn: 232136
This works similarly to the {thread/frame/process/target.script:...} feature - you write a summary string, part of which is
${var.script:someFuncName}
someFuncName is expected to be declared as
def someFuncName(SBValue,otherArgument) - essentially the same as a summary function
Since . -> [] are the only allowed separators, and % is used for custom formatting, .script: would not be a legitimate symbol anyway, which makes this non-ambiguous
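A hedged example of such a function and how it might be referenced from a
summary string (the function name, type name, and output are illustrative):

    def ByteSizeString(valobj, internal_dict):
        # valobj is the SBValue the ${var.script:...} token is evaluated against
        return "byte size = %d" % valobj.GetByteSize()

    # (lldb) type summary add --summary-string "info: ${var.script:ByteSizeString}" MyType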
llvm-svn: 220821
The way to do this is to write a synthetic child provider for your type, and have it vend the (optional) get_value function.
If get_value is defined, and it returns a valid SBValue, that SBValue's value (as in lldb_private::Value) will be used as the synthetic ValueObject's Value
The rationale for doing things this way is twofold:
- there are many possible ways to define a "value" (SBData, a Python number, ...) but SBValue seems general enough as a thing that stores a "value", so we just trade values that way and that keeps our currency trivial
- we could introduce a new level of layering (ValueObjectSyntheticValue), a new kind of formatter (synthetic value producer), but that would complicate the model (can I have a dynamic with no synthetic children but synthetic value? synthetic value with synthetic children but no dynamic?), and I really couldn't see much benefit to be reaped from this added complexity in the matrix
On the other hand, just defining a synthetic child provider with a get_value but returning no actual children is easy enough that it's not a significant road-block to adoption of this feature
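A hedged sketch of exactly that kind of provider, one that vends a value but
no children (the member name is a placeholder):

    class ValueOnlyProvider:
        def __init__(self, valobj, internal_dict):
            self.valobj = valobj
        def num_children(self):
            return 0
        def has_children(self):
            return False
        def get_child_index(self, name):
            return -1
        def get_child_at_index(self, index):
            return None
        def get_value(self):
            # any valid SBValue will do; its Value backs the synthetic object
            return self.valobj.GetChildMemberWithName('m_stored_value')  # hypothetical member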
Comes with a test case
llvm-svn: 219330
the user level. It adds the ability to invent new stepping modes implemented by python classes,
and to view the current thread plan stack and to some extent alter it.
I haven't gotten to documentation or tests yet. But this should not cause any behavior changes
if you don't use it, so it's safe to check it in now and work on it incrementally.
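As a rough, hedged sketch of what a Python stepping class might look like; the
method names below reflect my understanding of the scripted plan interface and
should be treated as assumptions rather than the exact contract this patch
defines (the function being searched for is also hypothetical):

    class StepUntilPlan:
        def __init__(self, thread_plan, internal_dict):
            self.thread_plan = thread_plan
        def explains_stop(self, event):
            # claim stops that happen while we are the controlling plan
            return True
        def should_stop(self, event):
            frame = self.thread_plan.GetThread().GetFrameAtIndex(0)
            if frame.GetFunctionName() == "interesting_function":  # hypothetical target
                self.thread_plan.SetPlanComplete(True)
                return True
            return False
        def should_step(self):
            # keep stepping until we decide to stop
            return True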
llvm-svn: 218642
PyTuple_SetItem steals a reference to the item it inserts into the tuple.
This, plus the Py_XDECREF of the tuple a few lines below, causes our session dictionary to go away after the first time a SWIG layer function is called, with disastrous effects for the first subsequent attempt to use any functionality in ScriptInterpreterPython
This fixes it
llvm-svn: 200429
The many many benefits include:
1 - Input/Output/Error streams are now handled as real streams, not a push-style input
2 - auto completion in python embedded interpreter
3 - multi-line input for "script" and "expression" commands now allow you to edit previous/next lines using up and down arrow keys and this makes multi-line input actually a viable thing to use
4 - it is now possible to use curses to drive LLDB (please try the "gui" command)
We will need to deal with and fix any buildbot and test failures that arise now that input/output and error are correctly hooked up in all cases.
llvm-svn: 200263
pure virtual base class and made StackFrame a subclass of that. As
I started to build on top of that arrangement today, I found that it
wasn't working out like I intended. Instead I'll try sticking with
the single StackFrame class -- there's too much code duplication to
make a more complicated class hierarchy sensible I think.
llvm-svn: 193983
defines a protocol that all subclasses will implement. StackFrame
is currently the only subclass and the methods that Frame vends are
nearly identical to StackFrame's old methods.
Update all callers to use Frame*/Frame& instead of pointers to
StackFrames.
This is almost entirely a mechanical change that touches a lot of
the code base so I'm committing it alone. No new functionality is
added with this patch, no new subclasses of Frame exist yet.
I'll probably need to tweak some of the separation, possibly moving
some of StackFrame's methods up in to Frame, but this is a good
starting point.
<rdar://problem/15314068>
llvm-svn: 193907
When debugging with the GDB remote in LLDB, LLDB uses special packets to discover the
registers on the remote server. When those packets aren't supported, LLDB doesn't
know what the registers look like. This checkin implements a setting that can be used
to specify a python file that contains the register definitions. The setting is:
(lldb) settings set plugin.process.gdb-remote.target-definition-file /path/to/module.py
Inside module there should be a function:
def get_dynamic_setting(target, setting_name):
This dynamic setting function is handed the "target", which is an SBTarget, and the
"setting_name", which is the name of the dynamic setting to retrieve. For the GDB
remote target definition the setting name is 'gdb-server-target-definition'. The
return value is a dictionary in the same format that the OperatingSystem
plug-ins use. I have checked in an example file that implements the x86_64 GDB
register set for people to see:
examples/python/x86_64_target_definition.py
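For illustration only, a heavily hedged sketch of the shape such a module might
take; the dictionary keys below are my assumptions, loosely modeled on the
OperatingSystem plug-in register dictionaries, and the single register entry is
a fragment with placeholder values. The checked-in example file is the
authoritative reference:

    import lldb

    def get_dynamic_setting(target, setting_name):
        if setting_name == 'gdb-server-target-definition':
            return {
                'sets': ['General Purpose Registers'],
                'registers': [
                    # one hypothetical entry; offsets and numbers are placeholders
                    {'name': 'rip', 'set': 0, 'bitsize': 64, 'offset': 0,
                     'encoding': lldb.eEncodingUint,
                     'format': lldb.eFormatAddressInfo,
                     'generic': 'pc'},
                ],
            }
        return None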
This allows LLDB to debug any architecture that is supported and allows users to
define the register contexts when the discovery packets (qRegisterInfo, qHostInfo)
are not supported by the remote GDB server.
A few benefits of doing this in Python:
1 - The dynamic register context was already supported in the OperatingSystem plug-in
2 - Register contexts can use all of the LLDB enumerations and definitions for things
like lldb::Format, lldb::Encoding, generic register numbers, invalid register
numbers, etc.
3 - The code that generates the register context can use the program to calculate the
register context contents (like offsets, register numbers, and more)
4 - True dynamic detection could be used where variables and types could be read from
the target program itself in order to determine which registers are available since
the target is passed into the python function.
This is designed to be used instead of XML since it is more dynamic: code flow and
functions can be used to build the dictionary.
llvm-svn: 192646
This is implemented by means of a get_dynamic_setting(target, setting_name) function vended by the Python module, which can respond to arbitrary string names with dynamically constructed
settings objects (most likely, some of those that PythonDataObjects supports) for LLDB to parse
This needs to be hooked up to the debugger via some setting to allow users to specify which module will vend the information they want to supply
llvm-svn: 192628
Summary:
This merge brings in the improved 'platform' command that knows how to
interface with remote machines; that is, query OS/kernel information, push
and pull files, run shell commands, etc... and implementation for the new
communication packets that back that interface, at least on Darwin based
operating systems via the POSIXPlatform class. Linux support is coming soon.
Verified the test suite runs cleanly on Linux (x86_64) and the build is OK on Mac OS
X Mountain Lion.
Additional improvements (not in the source SVN branch 'lldb-platform-work'):
- cmake build scripts for lldb-platform
- cleanup test suite
- documentation stub for qPlatform_RunCommand
- use log class instead of printf() directly
- reverted work-in-progress-looking changes from test/types/TestAbstract.py that work towards running the test suite remotely.
- add new logging category 'platform'
Reviewers: Matt Kopec, Greg Clayton
Review: http://llvm-reviews.chandlerc.com/D1493
llvm-svn: 189295
OS Plugins' __init__ method takes two arguments: (self, process)
I was erroneously passing the session_dict as well, as part of my PyCallable changes, and that caused plugins to fail to work
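A minimal hedged skeleton showing the corrected signature; the extra method is
my assumption of another part of the OperatingSystem plug-in interface and is
only there to make the skeleton self-contained:

    class OperatingSystemPlugIn:
        def __init__(self, process):
            # only the SBProcess is passed; no session_dict argument
            self.process = process
        def get_thread_info(self):
            # return a list of thread dictionaries (empty here for brevity)
            return []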
llvm-svn: 185240
The semi-unofficial way of returning a status from a Python command was to return a string (e.g. return "no such variable was found") that LLDB would pick up as a clue that an error had happened
This checkin changes that:
- SBCommandReturnObject now exports a SetError() call, which can take an SBError or a plain C-string
- script commands now drop any return value and expect the SBCommandReturnObject ("return object") to be filled in appropriately - if you do nothing, a success will be assumed
If your commands were relying on returning a value and having LLDB pick that up as an error, please change your commands to SetError() through the return object or expect changes in behavior
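A hedged before/after sketch (the command name and message are illustrative):

    # old style: the returned string was treated as a hint that an error occurred
    def find_thing_old(debugger, command, result, internal_dict):
        return "no such variable was found"

    # new style: fill in the return object; any return value is dropped
    def find_thing(debugger, command, result, internal_dict):
        result.SetError("no such variable was found")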
llvm-svn: 184893
Now, the way SWIG wrappers call into Python is through a utility PyCallable object, which overloads operator () to look like a normal function call
Plus, using the SBTypeToSWIGWrapper() family of functions, we can call python functions transparently as if they were plain C functions
Using this new technique should make adding new Python call points easier and quicker
The PyCallable is a generally useful facility, and we might want to consider moving it to a separate layer where other parts of LLDB can use it
llvm-svn: 184608
Any time a SWIG wrapper needs a PyObject for an SB object, it now should call into SBTypeToSWIGWrapper<SBType>(SBType*)
If you try to use it on an SBType for which there is not an implementation yet, LLDB will fail to link - just add your specialization to python-swigsafecast.swig and rebuild
This is the first step in simplifying our SWIG Wrapper layer
llvm-svn: 184580
Specifically, the ${target ${process ${thread and ${frame specifiers have been extended to allow a subkeyword .script:<fctName> (e.g. ${frame.script:FooFunction})
The functions are prototyped as
def FooFunction(Object,unused)
where object is of the respective SB-type (SBTarget for target.script, ... and so on)
This has not been implemented for ${var because it would be akin to a Python summary which is already well-defined in LLDB
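A hedged example (the function name, output, and the frame-format usage line are
illustrative):

    def FrameDescription(frame, unused):
        # frame is an SBFrame when referenced as ${frame.script:FrameDescription}
        return "pc=0x%x in %s" % (frame.GetPC(), frame.GetFunctionName())

    # e.g. (assuming frame-format accepts it):
    # (lldb) settings set frame-format "${frame.script:FrameDescription}\n"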
llvm-svn: 184500
Allowing LLDB to resolve names of Python functions when they are located in classes
This allows things like *bound* classmethods to be used for formatters, commands, ...
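For example, a hedged sketch of a classmethod used as a summary (the class,
type name, and command line are illustrative):

    class MyFormatters:
        @classmethod
        def pointer_summary(cls, valobj, internal_dict):
            return "0x%x" % valobj.GetValueAsUnsigned()

    # (lldb) type summary add -F mymodule.MyFormatters.pointer_summary MyPointerType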
llvm-svn: 183772
Upon encountering an object not of type string, LLDB will get the string representation of it (akin to calling str(X) in Python code) and use that as the summary to display
Feedback is welcome as to whether repr() should be used instead (but the argument for repr() better be highly persuasive :-)
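For instance, a hedged sketch of a summary function that returns a number
rather than a string (the function name is illustrative):

    def RawValueSummary(valobj, internal_dict):
        # returning a non-string is now acceptable; LLDB will use str() on the result
        return valobj.GetValueAsUnsigned()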
llvm-svn: 182953
Python breakpoint actions can return False to say that they don't want to stop at the breakpoint to which they are associated
Almost all of the work to support this notion of a breakpoint callback was in place, but two small moving parts were missing:
a) the SWIG wrapper was not checking the return value of the script
b) when passing a Python function by name, the call statement was dropping the return value of the function
This checkin addresses both concerns and makes this work
Care has been taken that you only keep running when an actual value of False has been returned, and that any other value (None included) means Stop!
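A hedged example of a callback relying on this (the variable being inspected is
a placeholder):

    def stop_when_flag_set(frame, bp_loc, internal_dict):
        flag = frame.FindVariable("g_should_stop")  # hypothetical variable
        # returning False means "keep going"; any other value, None included, stops
        return flag.GetValueAsUnsigned() != 0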
llvm-svn: 181866
This commit enables the new HasChildren() feature for synthetic children providers
Namely, it hooks up the required bits and pieces so that individual synthetic children providers can implement a new (optional) has_children call
Default implementations have been provided where necessary so that any existing providers continue to work and behave correctly
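A hedged fragment showing the new optional call (the member name is a
placeholder; the rest of the provider is elided):

    class SomeListProvider:
        def __init__(self, valobj, internal_dict):
            self.valobj = valobj
        def has_children(self):
            # cheap check: look at the head pointer instead of walking the whole list
            return self.valobj.GetChildMemberWithName('head').GetValueAsUnsigned() != 0
        # num_children, get_child_at_index and get_child_index elided for brevity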
Next steps are:
2) writing smart implementations of has_children for our providers whenever possible
3) make a test case
llvm-svn: 166495
Given our implementation of ValueObjects we could have a scenario where a ValueObject has a dynamic type of Foo* at one point, and then its dynamic type changes to Bar*
If Bar* has synthetic children enabled, by the time we figure that out, our public API is already vending SBValues wrapping a DynamicVO instead of a SyntheticVO, and there was
no trivial way for us to change the SP inside an SBValue on the fly
This checkin reimplements SBValue in terms of a wrapper, ValueImpl, that allows these substitutions on-the-fly by overriding GetSP() to do The Right Thing (TM)
As an additional bonus, GetNonSyntheticValue() now works, and we can get rid of the ForceDisableSyntheticChildren idiom in ScriptInterpreterPython
Lastly, this checkin makes sure the synthetic VOs get the correct m_value and m_data from their parents (the lack of which prevented summaries from working in some cases)
llvm-svn: 166426
- Tweaked a parameter name in SBDebugger.h so my typemap will catch it;
- Added a SBDebugger.Create(bool, callback, baton) to the swig interface;
- Added SBDebugger.SetLoggingCallback to the swig interface;
- Added a callback utility function for log callbacks;
- Guard against Py_None on both callback utility functions;
- Added a FIXME to the SBDebugger API test;
- Added a __del__() stub for SBDebugger.
We need to be able to get both the log callback and baton from an
SBDebugger if we want to protect against memory leaks (or make the user
responsible for holding another reference to the callback).
Additionally, it's impossible to revert from a callback-backed log
mechanism to a file-backed log mechanism.
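A heavily hedged usage sketch; I am assuming here that the Python-side callback
is invoked with just the log line as a string, which is an assumption about the
typemap rather than something this change spells out:

    import lldb

    def log_callback(message):
        # assumption: the wrapped callback receives only the log string
        print("lldb log: " + message.rstrip())

    debugger = lldb.SBDebugger.Create()
    debugger.SetLoggingCallback(log_callback)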
llvm-svn: 162633