This is a big one, touching lots of files. Some of the platforms
aren't tested yet. Briefly, this changes the return value of the
os/posix functions stat(), fstat(), statvfs(), fstatvfs(), and the
time functions localtime(), gmtime(), and strptime() from tuples into
pseudo-sequences. When accessed as a sequence, they behave exactly as
before. But they also have attributes like st_mtime or tm_year. The
stat return value, moreover, has a few platform-specific attributes
that are not available through the sequence interface (because
everybody expects the sequence to have a fixed length, these couldn't
be added there). If your platform's struct stat doesn't define
st_blksize, st_blocks or st_rdev, they won't be accessible from Python
either.
(Still missing is a documentation update.)
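For illustration (the path is arbitrary, and st_blksize is only present
where the platform's struct stat defines it):

    import os, time

    st = os.stat(".")                       # any existing path
    assert st[6] == st.st_size              # old fixed-length sequence access still works
    print(st.st_mtime)                      # new attribute access
    print(getattr(st, "st_blksize", None))  # platform-specific; may be absent

    t = time.localtime()
    assert t[0] == t.tm_year                # same scheme for struct_time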
The GUI-mode code to display properties blew up if the property functions
(get, set, etc) weren't simply methods (or functions).
"The problem" here is really that the generic document() method dispatches
to one of .doc{routine, class, module, other}(), but all of those require
a different(!) number of arguments. Thus document isn't general-purpose
at all: you have to know exactly what kind of thing it is you're going
to document first, in order to pass the correct number of arguments to
.document for it to pass on. As an expedient hack, this change just tacks
"*ignored" onto the end of the formal argument lists for the .docXXX
routines so that .document's caller doesn't have to know in advance which
path .document is going to take.
(With slight cosmetic improvements to shorten lines and a grammar fix
to a docstring.)
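For illustration, a minimal sketch of the dispatch pattern described
above; the class and its docXXX signatures are made up for this sketch
and are not pydoc's actual code:

    import inspect

    class Doc:
        def document(self, obj, *args):
            # The caller can't know which branch will be taken, so each
            # docXXX routine absorbs any extra arguments via *ignored.
            if inspect.ismodule(obj):
                return self.docmodule(obj, *args)
            if inspect.isroutine(obj):
                return self.docroutine(obj, *args)
            return self.docother(obj, *args)

        def docmodule(self, obj, name=None, *ignored):
            return "module %s" % (name or obj.__name__)

        def docroutine(self, obj, name=None, *ignored):
            return "routine %s" % (name or obj.__name__)

        def docother(self, obj, name=None, *ignored):
            return "other %r" % (obj,)

    print(Doc().document(inspect, "inspect", "extra", "args"))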
This adds -X and -E options to freeze. From the docstring:
-X module Like -x, except the module can never be imported by
the frozen binary.
-E: Freeze will fail if any modules can't be found (that
were not excluded using -x or -X).
This fixes the behavior reported by SF bug #404545, where a file
x.y.py could be imported by the statement "import x.y" when there's a
frozen package x (I believe even if x.y also exists as a frozen
module). :-)
Add a test that prevents the __hello__ bytecode from going stale
unnoticed again.
The test also exercises the loophole noted in SF bug #404545. This test
will fail right now; I'll check in the fix in a minute.
In the proxy's handlers for binary operations, the left-hand operand may
not be the proxy in all cases. If it isn't,
we end up doing two things: a) unwrapping something that isn't a
PyWeakReference (later resulting in a core dump) and b) passing a
proxy as the right-hand operand anyway, even though that can't be
handled by the actual handler (maybe eventually causing a core dump).
This is fixed by always unwrapping all the proxies involved before
passing anything to the actual handler.
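To illustrate the two dispatch orders (Number is a made-up class; with
the fix, both lines work because the proxy is unwrapped whichever side it
is on):

    import weakref

    class Number:
        def __init__(self, value):
            self.value = value
        def __add__(self, other):
            return self.value + other
        def __radd__(self, other):
            return other + self.value

    n = Number(2)
    p = weakref.proxy(n)
    print(p + 1)   # proxy is the left-hand operand
    print(1 + p)   # proxy is the right-hand operand -- the case described above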
The symbol table pass didn't have an explicit case for the list_iter
node which is used only for a nested list comprehension. As a result,
the target of the list comprehension was treated as a use instead of
an assignment. Fix is to add a case to symtable_node() to handle
list_iter.
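For reference, the kind of code that exercises list_iter; the inner
target j is bound by the comprehension, not merely used:

    # j is bound by the second `for`, which is parsed as a list_iter node.
    pairs = [(i, j) for i in range(3) for j in range(2)]
    print(pairs)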
Also, rework and document a couple of the subtler implementation
issues in the symbol table pass. The symtable_node() switch statement
depends on falling through the last several cases, in order to handle
some of the more complicated nodes like atom. Add a comment
explaining the behavior before the first fall through case. Add a
comment /* fall through */ at the end of each case so that it is
marked as such.
Move the for_stmt case out of the fall through logic, which simplifies
both for_stmt and default. (The default used the local variable start
to skip the first three nodes of a for_stmt when it fell through.)
Rename the flag argument to symtable_assign() to def_flag and add a
comment explaining its use:
The third argument to symtable_assign() is a flag to be passed to
symtable_add_def() if it is eventually called. The flag is useful
to specify the particular type of assignment that should be
recorded, e.g. an assignment caused by import.
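A small illustration of that distinction using the Python-level symtable
module, which exposes the same symbol table pass (the code string and
names are arbitrary):

    import symtable

    table = symtable.symtable("import os\nx = 1\n", "<example>", "exec")
    print(table.lookup("os").is_imported())   # True: recorded as an import binding
    print(table.lookup("x").is_assigned())    # True: recorded as a plain assignment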
isinstance() now allows any object as the first argument and a class, a
type or something with a __bases__ tuple attribute for the second
argument. This closes SF patch #464992.
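For example (FakeClass is a made-up stand-in; the second argument only
needs to carry a __bases__ tuple):

    class FakeClass:
        pass

    fake = FakeClass()
    fake.__bases__ = ()            # "something with a __bases__ tuple attribute"

    print(isinstance(42, fake))    # False, but no TypeError for a non-type second arg
    print(isinstance(42, int))     # True, unchanged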
Quoth the OpenSSL RAND_add man page:
OpenSSL makes sure that the PRNG state is unique for each
thread. On systems that provide /dev/urandom, the
randomness device is used to seed the PRNG transparently.
However, on all other systems, the application is
responsible for seeding the PRNG by calling RAND_add(),
RAND_egd(3) or RAND_load_file(3).
I decided to expose RAND_add() because it's general and RAND_egd()
because it's a useful special case. RAND_load_file() didn't seem to
offer much over RAND_add(), so I skipped it. Also supplied
RAND_status() which returns true if the PRNG is seeded and false if
not.
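For example, using these wrappers as they appear in today's ssl module
(the byte string and entropy estimate are arbitrary):

    import os
    import ssl

    print(ssl.RAND_status())            # true once the PRNG is seeded
    ssl.RAND_add(os.urandom(16), 16.0)  # mix in extra bytes with an entropy estimate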
test_no_semis_header_splitter(): This actually should still split.
test_no_split_long_header(): An example of an unsplittable line.
test_no_semis_header_splitter(): Test for SF bug # 471918, Generator
splitting long headers.
_split_header(): Split on folding whitespace if the attempt to split
on semi-colons failed.
_split_header(): Patch by Matthew Cowles for fixing SF bug # 471918,
Generator splitting long headers.
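To see the folding behaviour under discussion, a hedged example using
today's email API; the Subject value is arbitrary and contains no
semi-colons, so it gets folded on whitespace when flattened:

    import io
    from email.generator import Generator
    from email.message import Message

    msg = Message()
    msg["Subject"] = "word " * 20          # long header with no semi-colons
    buf = io.StringIO()
    Generator(buf).flatten(msg)
    print(buf.getvalue())                  # header emitted as folded continuation lines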
There are now no known cases where the compiler package computes a
stack depth lower than the one computed by the builtin compiler. To
achieve this state, we had to fix bugs in both compilers. :-)
The chief change is to do the depth calculations with respect to basic
blocks. The stack effect of each block is calculated. Then the flow graph
is traversed using breadth-first search to find the max weight path
through the graph.
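A minimal sketch of that per-block idea, not the compiler package's
actual code; it assumes an acyclic flow graph and precomputed
(net effect, peak) numbers per block:

    from collections import deque

    def max_stack_depth(effects, successors, entry):
        # effects[b] = (net, peak): net stack effect of block b and the
        # highest depth reached inside b relative to its entry depth.
        best = 0
        deepest_entry = {}                    # deepest entry depth seen per block
        queue = deque([(entry, 0)])
        while queue:
            block, depth = queue.popleft()
            if deepest_entry.get(block, -1) >= depth:
                continue                      # already covered by a deeper visit
            deepest_entry[block] = depth
            net, peak = effects[block]
            best = max(best, depth + peak)
            for nxt in successors.get(block, ()):
                queue.append((nxt, depth + net))
        return best

    # Entry block pushes two values, then branches to a block that pushes
    # one more or a block that pops one.
    effects = {"b0": (2, 2), "b1": (1, 1), "b2": (-1, 0)}
    successors = {"b0": ["b1", "b2"], "b1": [], "b2": []}
    print(max_stack_depth(effects, successors, "b0"))   # 3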
Had to fix the StackDepthTracker to calculate the right info for
several opcodes: LOAD_ATTR, CALL_FUNCTION (and friends), MAKE_CLOSURE,
and DUP_TOPX.
XXX Still need to handle free variables in MAKE_CLOSURE.
XXX There are still a lot of places where the computed stack depth is
larger than for the builtin compiler. These won't cause the
interpreter to overflow the frame, but they waste space.
Also minor tweaks to internal routines.
Use PyCF_MASK instead of an explicit list of flags.
For the MAKE_CLOSURE opcode, the number of items popped off the stack
depends on both the oparg and the number of free variables for the
code object. Fix the code so it accounts for the free variables.
In com_classdef(), record an extra pop to account for the STORE call
after the BUILD_CLASS.
Get rid of some commented out debugging code in com_push() and
com_pop().
Factor string resize logic into helper routine com_check_size().
In com_addbyte(), remove redundant if statement after assert. (They
test the same condition.)
In several routines, use string macros instead of string functions.