tef
May 30, 2004

-> some l-system crap ->

SmirkingJack posted:

What is the standard for documenting your code? I get a kick out of javadoc/phpdoc. Is there a pythondoc or some equivalent way of generating HTML documentation of your API?

Docstrings and pydoc.

http://www.diveintopython.org/getting_to_know_python/documenting_functions.html

http://docs.python.org/lib/module-pydoc.html
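
For example (a made-up module called shapes.py):

code:
"""shapes.py - a tiny example module."""

def area(radius):
    """Return the area of a circle with the given radius."""
    return 3.14159 * radius ** 2
Running pydoc -w shapes (or python -m pydoc -w shapes) then writes shapes.html with the generated documentation, and plain pydoc shapes shows the same thing in the terminal.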


tef
May 30, 2004

-> some l-system crap ->

Bonus posted:

Does anyone know if there's any plan to implement proper closures in Python?

No. No plans. Guido hates lambda, and thinks that explicitly writing out functions is better.

quote:

code:
some_collection.each {|item| puts item}
Whereas in Python you'd do
code:
for item in some_collection: print item

This is not an example of iterators in the sense that python uses them. An iterator in python is something that returns each element in order (a generator is just a simplified way of writing one). This is an example of using closures to implement control flow structures. I think you meant iteration rather than iterators.

The same ruby code in python would be more like this:

code:
class some_collection_class(...):
    def each(self, x):
        for item in self.items:
            x(item)

def foo(item):
    print item

some_collection.each(foo)
So, instead of a simple loop we have to create a closure, capture the namespace, do a method lookup and finally do a function call for each loop iteration. Lovely.

I'm sure a smarter compiler might be able to remove some of this, but that requires some very clever typing.


quote:

Now notice that Ruby achieves all these three things with a single language construct

And a whole slew of helper methods. I would argue that it isn't a simple language construct, as you don't know the underlying control flow. You rely on the implementation of the helper methods and hope they match. Every class is free to re-implement the control flow as it sees fit.

It isn't necessarily a good thing to achieve three different structures with one, when they are different things.

quote:

But the usefulness is evident and it would be really nice to have closures for doing stuff like in the third example.

The usefulness is saving the few lines it takes to type out an explicit function. See 'Guido hates lambda' again.

quote:

EDIT: Maybe something like
code:
click_counter = 0
button = Button('button_name') do:
  click_counter +=1
Internally, the block would be called with something similar to the yield statement.

How do you pass more than one closure? (Special cases are not popular in python)

Why would you call it yield, or even something similar to yield, when in Python yield is used to return a value, not to yield control flow to a function? So you would need a different name from the Ruby construct.

Would you also implement hasblock in Python too? Or would you have special syntax for defining a function that accepts a block?

I'm not against closures per se, or against extending python's lambda construct to be more powerful, but it is hard to do so and maintain the spirit of python.

tef
May 30, 2004

-> some l-system crap ->
To make the distinction more clear:

In Python, the iterator decides whether or not to return a value.

In Ruby, the iterator decides whether to run the body or exit. This makes it hard to specify what to do when the values run out.

To give an example of Python that does not translate so obviously to Ruby:

code:
for x in xrange(0, 6):
    print x
else:
    print "finished"
The flow control decision is not made by the generator, but specified around it.
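
To make it concrete, the else clause only runs when the loop finishes without hitting break, so the "what happens at the end" logic lives outside the generator:

code:
# prints "found 4"; with needle = 10 it would print "not found"
needle = 4
for x in xrange(0, 6):
    if x == needle:
        print "found", x
        break
else:
    print "not found"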

I am not sure how this would work with Ruby, but I assume you could do something with exceptions, or introduce new syntax to pass another closure, but would this then be for all method calls? Would you have to introduce a new haselse operator too?

In the end, although you can express one as the other, the implicit nature of ruby contains a lot of special cases, which I do not think is the right approach for python.

I'll admit I don't like thinking in terms of closure passing for control flow, and I really don't like the implicit passing either. It's a very personal objection.

Python already has its own syntax for iteration and the with statement. The real problem is passing functions as arguments, so instead of trying to implement anonymous functions, I think we should make naming functions less cumbersome.

As a suggestion to your syntax, I propose the following:

code:
click_counter = 0
button = Button('increase') do:
  def on_click():
    click_counter +=1
So, you could do something like:

code:
range_value = 0
toggle = Range('nitems') do:
  def on_up():
    range_value +=1
  def on_down():
    range_value -= 1
These functions would be passed as keyword arguments.

So, things like button would be defined:

code:
def Button(name, on_click=None):
    ... 
Rationale:

You only have to name the function once.

By naming the functions passed you can have more than one, and it also allows you to pass these functions as named values, using the existing **args mechanism.

The new scope for the call means that it does not pollute the namespace.

The downside is that you cannot do it as part of a larger expression.
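
For comparison, the nearest you can get with today's syntax is naming the callback first and passing it by keyword (Button is still the hypothetical widget from above):

code:
click_counter = [0]   # a mutable cell, since the nested function can't rebind an outer local in python 2

def on_click():
    click_counter[0] += 1

button = Button('increase', on_click=on_click)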

tef
May 30, 2004

-> some l-system crap ->

Lancer383 posted:

Does anyone here have experience working with the PyExcelerator python module or any other modules geared towards writing directly to Excel xls files?

I have used OLE with Excel to write and modify Excel files. Run screaming away.

Instead, have you considered writing to CSV files and then opening them in excel later?
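
If CSV is enough, the csv module in the standard library will write them (filename and rows made up here):

code:
import csv

rows = [('name', 'score'), ('alice', 10), ('bob', 12)]

f = open('results.csv', 'wb')
csv.writer(f).writerows(rows)
f.close()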

tef
May 30, 2004

-> some l-system crap ->

Lancer383 posted:

I have considered CSV, but the problem is that we're looking to bundle all of the results up in one container, so to speak.

Instead of returning an Excel sheet or something that can be opened directly within Excel, could you distribute (in advance) an Excel spreadsheet with a macro (and a button) that loads the CSV and builds the final Excel version, and generate the CSV from the server end?

tef
May 30, 2004

-> some l-system crap ->
And, of course l=[x for x in "1234"]

Edit: I am a mong, fixed :)

tef fucked around with this message at 16:00 on Apr 3, 2008

tef
May 30, 2004

-> some l-system crap ->

duck monster posted:

Python was not designed as a teaching language. It was designed as a scripting language for Andrew Tanenbaum's Amoeba operating system, which is where I first heard of it many many moons ago.

It was 'heavily influenced' by ABC, a language designed for teaching.

quote:

In 1986 I moved to a different project at CWI, the Amoeba project. Amoeba was a distributed operating system. By the late 1980s we found we needed a scripting language. I had a large degree of freedom on that project to start my own mini project within the scope of what we were doing. I remembered all my experience and some of my frustration with ABC. I decided to try to design a simple scripting language that possessed some of ABC's better properties, but without its problems.

So I started typing. I created a simple virtual machine, a simple parser, and a simple runtime. I made my own version of the various ABC parts that I liked. I created a basic syntax, used indentation for statement grouping instead of curly braces or begin-end blocks, and developed a small number of powerful data types: a hash table (or dictionary, as we call it), a list, strings, and numbers.

I took ABC's ingredients and shuffled them around a bit. Python was similar to ABC in many ways, but there were also differences. Python's lists, dictionaries, basic statements and use of indentation differed from what ABC had. ABC used uppercase for keywords. I never got comfortable with the uppercase, neither reading nor typing it, so in Python keywords were lowercase.

I think my most innovative contribution to Python's success was making it easy to extend. That also came out of my frustration with ABC. ABC was a very monolithic design. There was a language design team, and they were God. They designed every language detail and there was no way to add to it. You could write your own programs, but you couldn't easily add low-level stuff.

[...]

I already knew we would want to use Python on different platforms. I knew we wanted to use Python on Amoeba, the operating system we were developing, and on UNIX, the operating system we were using on our desktops.

Although it was written as part of the Amoeba project, it seems he was considering more general goals very early on.

tef
May 30, 2004

-> some l-system crap ->

Scaevolus posted:

You can't concatenate strings by just doing a few pointer reassignments.

When you are concatenating strings a lot, ropes make a good alternative. They seem to be in PyPy too.

Although a rope makes concatenation cheap, I am curious as to how the hashing could be done equally fast.

tef
May 30, 2004

-> some l-system crap ->
I was hoping you could derive some hash such that

hash(concat(a,b)) = f(hash(a),hash(b))

for some magical f.
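
One such f does exist if you pick the hash yourself - a polynomial hash composes under concatenation (this is just a sketch, not how Python's built-in string hash works):

code:
BASE = 257
MOD = (1 << 61) - 1

def poly_hash(s):
    h = 0
    for ch in s:
        h = (h * BASE + ord(ch)) % MOD
    return h

def concat_hash(hash_a, hash_b, len_b):
    # hash(a + b) == hash(a) * BASE**len(b) + hash(b)  (mod MOD)
    return (hash_a * pow(BASE, len_b, MOD) + hash_b) % MOD

a, b = "hello", "world"
assert poly_hash(a + b) == concat_hash(poly_hash(a), poly_hash(b), len(b))
Since hash() isn't built that way, a rope type would have to carry its own hash along these lines.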

tef
May 30, 2004

-> some l-system crap ->

Sock on a Fish posted:

I tried that, but I can't find any way to get those error messages. stderr and stdout are both null.

I've not been having problems with Popen recently; maybe it is how you are calling it:

code:
import subprocess
p = subprocess.Popen(command,
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     bufsize=1)

print p.stdout.readlines()
print p.stderr.readlines()

tef
May 30, 2004

-> some l-system crap ->

Sock on a Fish posted:

Score! I thought those extra arguments were the defaults and was just specifying my command. Thanks!

If you use subprocess to grab something that is a bit more than a few lines, you may find that it blocks when calling read() or readlines() on one of the outputs if the process is writing to the other.

I.e. if you are waiting on stderr, and it is writing to stdout, things can deadlock due to buffering.

The solution I used was

code:
import threading

class Buffer(threading.Thread):
    """Buffers the contents of a file object in a separate thread.
    It is used to read from stdout and stderr concurrently."""
    def __init__(self, file):
        threading.Thread.__init__(self)
        self.file = file
        self.buffer = None
    def run(self):
        self.buffer = self.file.readlines()


... some call to popen later ...

buffer=Buffer(stderr)
buffer.start()

print stdout.readlines()

buffer.join()

print buffer.buffer

tef
May 30, 2004

-> some l-system crap ->
I'm looking for a nice imaging/drawing api/toolkit for python.

The problem is that I need to produce real-time (well, frequently updated) visualisations (mostly charts) on linux/python (although preferably cross-platform).

I've been looking at using a number of chart APIs but have avoided them as their output is ugly as sin.

NodeBox is pretty but it's OS X only. Processing is also pretty but it's Java.

I could also use something like pyglet or even pyqt, and I'm looking into the possibility of pycairo.

Any suggestions?

tef
May 30, 2004

-> some l-system crap ->

such a nice boy posted:

So...what feature would you most like added to Python?

where/with:

code:
x = foo(dave,3,5,toot) where:
    def dave(x):
        return x*2
    toot="cocks"

tef
May 30, 2004

-> some l-system crap ->

ATLbeer posted:

I would like some sort of central logging of all messages that have gone through the system for debugging and error checking since it's going to be transactional system.

Does the logging module not do what you want?

http://docs.python.org/lib/module-logging.html

Specifically the syslog handler, or the HTTP handler:
http://docs.python.org/lib/node416.html
http://docs.python.org/lib/node420.html
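
A minimal sketch of the syslog route (the logger name and format are made up; on linux you can point it at /dev/log):

code:
import logging
import logging.handlers

logger = logging.getLogger('transactions')
logger.setLevel(logging.INFO)

handler = logging.handlers.SysLogHandler(address='/dev/log')
handler.setFormatter(logging.Formatter('%(name)s: %(levelname)s %(message)s'))
logger.addHandler(handler)

logger.info('message accepted: id=%s', 42)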

tef
May 30, 2004

-> some l-system crap ->

agscala posted:

I looked all over the web and I couldn't find a solution. I emailed the creator of wcurses and haven't received a response... Any help is much appreciated for this beginner!

Extract the curses files into c:\python25\lib not c:\python25\lib\site-packages

tef
May 30, 2004

-> some l-system crap ->

Centipeed posted:

And I'm guessing inheriting from object gives me access to some functions that would otherwise not be available had I not done so? Can you give me any examples?

It means it is a new-style class:

http://www.python.org/doc/newstyle/
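
For example, properties (and other descriptor-based features) only behave properly on new-style classes in python 2 (class name made up):

code:
class Temperature(object):
    def __init__(self, celsius):
        self.celsius = celsius

    @property
    def fahrenheit(self):
        return self.celsius * 9.0 / 5.0 + 32

print Temperature(100).fahrenheit   # 212.0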

tef
May 30, 2004

-> some l-system crap ->

JoeNotCharles posted:

(You can't just quote the entire string to get it right no matter what's in the filename, you have to manually find and escape the !.)

code:
$ echo toot >  'a!b'
$ cat 'a!b'
toot
$
:shobon:

tef
May 30, 2004

-> some l-system crap ->

Stump Truck posted:

However, is there a way to give it the parameters through the terminal? I'm guessing no, since the code isn't telling it to wait for any input.

code:
$ cat > echo.py
import sys

for item in sys.argv:
    print item

$ python echo.py hello 1 2 3 4
echo.py
hello
1
2
3
4

tef
May 30, 2004

-> some l-system crap ->

Stump Truck posted:

Do i enter that code in terminal or BBEdit?

In python, you can import the sys module - one of the things in the sys module is a list called argv.

This list contains all of the arguments passed into the program, including the name of the program.


So, if you were to write this in BBEdit

code:
import sys

for item in sys.argv:
    print item
And save it as echo.py

You could run it from the terminal by typing: python echo.py

If you were to run it as python echo.py hello world

You would get
code:
$ python echo.py hello world
echo.py
hello
world
So, the first argument is argv[1], and so on.

argv is always a list of strings. If you want numbers, you will have to convert them using int() or float().
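
For example, a made-up script that adds up its arguments:

code:
import sys

total = 0
for arg in sys.argv[1:]:
    total += int(arg)
print total
Running python total.py 1 2 3 prints 6.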

tef
May 30, 2004

-> some l-system crap ->

dense vegetation posted:

I've looked at a couple of Python books / resources, namely Mark Lutz - Learning Python, Swaroop C. H. - A Byte Of Python and I'm not satisfied. If anyone has ever read Kochan's Programming in C you'll know what I'm looking for. I find a lot of the stuff so far presented to me a bit too "scatty" and assuming prior knowledge of Java / C / C++, which, although I am very familiar with the C syntax, I don't want.

Although aimed at the beginner, Zelle's Python Programming: An Introduction to Computer Science might be useful.

tef
May 30, 2004

-> some l-system crap ->

LuckySevens posted:

code:
import time

today = time.localtime(time.time())
theDate = time.strftime("%A %B %d", today)
What I don't get exactly is the time. before it and the time.time() inside the function.

time is a module; you imported time at the top of your program.

Inside the time module there are a number of functions, including 'time', 'localtime' and 'strftime'.

So, in your program 'time' means a reference to the time module, and 'time.time' means the time function inside the time module.


code:
>>> import time
>>> time 
<module 'time' from '/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/lib-dynload/time.so'>
>>> dir(time)
['__doc__', '__file__', '__name__', 'accept2dyear', 'altzone', 'asctime', 'clock', 'ctime', 'daylight', 'gmtime', 'localtime', 'mktime', 'sleep', 'strftime', 'strptime', 'struct_time', 'time', 'timezone', 'tzname', 'tzset']
>>> time.time()
1219910575.051717

tef
May 30, 2004

-> some l-system crap ->

ATLbeer posted:

Treading into a territory I've never been in and am having a hard time Googling up what I need here.

You could use pycurl to open the connection; it takes a callback.
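
A rough sketch with pycurl (the URL is made up) - the WRITEFUNCTION callback is handed each chunk of the response body as it arrives:

code:
import pycurl

def on_chunk(data):
    print "got %d bytes" % len(data)

c = pycurl.Curl()
c.setopt(pycurl.URL, "http://example.com/stream")
c.setopt(pycurl.WRITEFUNCTION, on_chunk)
c.perform()
c.close()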

tef
May 30, 2004

-> some l-system crap ->
Python + LDAP + Active Directory is making me cry.

I'm trying to get it to authenticate, and I am either being told the credentials are wrong, or that a successful bind must be completed.

This is some of the crap code I have cobbled together trying to get python-ldap to work.

code:
import ldap

class ActiveDirectory:
    def __init__(self, url, binddn, password, basedn):
        self.url = url
        self.binddn = binddn
        self.password = password
        self.basedn = basedn

    def auth(self, username, password):
        l = ldap.initialize(self.url, 0)
        #l.set_option(ldap.OPT_PROTOCOL_VERSION, 3)

        print "binding %s %s" % (self.binddn, self.password)
        print l.simple_bind_s(self.binddn, self.password)
        print "bound"

        print "searching for %s" % username
        print "basedn=%s" % self.basedn

        result = l.search_ext_s(self.basedn, ldap.SCOPE_SUBTREE, "sAMAccountName=%s" % username, ['dn'])[0][1]
        print result
        l.unbind_s()

        l.simple_bind_s(result['dn'][0], password)

        result = l.search_ext_s(self.basedn, ldap.SCOPE_SUBTREE, "sAMAccountName=%s" % username, ['mail', 'givenName', 'sn', 'sAMAccountName'])[0][1]
        print result
        l.unbind_s()
So far I have tried it on linux, windows, with a full DN, user@domain.co.uk, and I can't seem to authenticate *enough* to be able to perform a search.

Help!


Edit:
AHAHAHAHAHAHAAH

code:
l.set_option(ldap.OPT_REFERRALS, 0)

tef fucked around with this message at 17:25 on Sep 26, 2008

tef
May 30, 2004

-> some l-system crap ->

bitprophet posted:

I'm imagining a world where they did what Apple did circa 2001,

Let's all run Xenix and A/UX

tef
May 30, 2004

-> some l-system crap ->
I guess what I'm trying to say is that both companies have dabbled in Unix before.

tef
May 30, 2004

-> some l-system crap ->

JoeNotCharles posted:

Sounds really interesting - be sure to submit it to sn.printf.net so it shows up in my goon feed.

I added m0nk3y's blog to sn.printf.net, on the assumption he doesn't mind being associated with goons :shobon:

tef
May 30, 2004

-> some l-system crap ->
The command should be "/Library/Frameworks/Python.framework/Versions/2.5/bin/epydoc"

tef
May 30, 2004

-> some l-system crap ->
Yes, but:

Why efficient? Strings are immutable in Python, so a character-by-character approach often won't be that efficient.

Why not regular expressions? They're in the standard library.

tef
May 30, 2004

-> some l-system crap ->
the problem is that you often don't know the filesystem a directory is using, and so even under unix you can be subject to windows filename restrictions.

what's wrong with try/except instead of trying to guess what might be valid?
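
e.g. something along these lines (aux.txt is just an example of a name windows rejects):

code:
filename = 'aux.txt'   # reserved device name on windows

try:
    f = open(filename, 'w')
except (IOError, OSError), e:
    print "couldn't create %r: %s" % (filename, e)
else:
    f.write('hello\n')
    f.close()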

tef
May 30, 2004

-> some l-system crap ->

dagard posted:

I'm missing something blindingly obvious, aren't i?

http://docs.python.org/tutorial/controlflow.html#unpacking-argument-lists

As mentioned already, you can use *foo to unpack the arguments.

What hasn't been mentioned is that you can use it in function definitions:

code:
def foo(*args):
    pass

Calling foo(1) sets args to (1,), foo(1, 2, 3) sets args to (1, 2, 3), and so on (args is a tuple).
There is an equivalent operator, **, for dictionaries.
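
The ** form collects keyword arguments into a dict, and unpacks a dict back into keyword arguments at the call site:

code:
def bar(**kwargs):
    for name, value in kwargs.items():
        print name, value

bar(x=1, y=2)

options = {'x': 1, 'y': 2}
bar(**options)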

tef
May 30, 2004

-> some l-system crap ->
OOP is fun and games but functional style programming in python is where the real fun is at.

Generators and list comprehensions are both awesome and powerful.
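
For example (a throwaway generator feeding a list comprehension):

code:
def evens(limit):
    n = 0
    while n < limit:
        yield n
        n += 2

print [x * x for x in evens(10)]   # [0, 4, 16, 36, 64]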

tef
May 30, 2004

-> some l-system crap ->
I really like being able to choose when to make things classes and when to make things functional style.

In a recent project at work, I've used a mixture of composition and inheritance in the objects, and functions for other parts.


Part of me wonders about prototypes in python :gay:

tef
May 30, 2004

-> some l-system crap ->

LuckySevens posted:

Of course, when inputting the dir I want to evaluate, it returns a perfect output. The only problem is, I don't quite understand exactly how this is working for printing, but what are the path0, path1 values and where are they coming from? Such a noob :(

As mentioned earlier, sort takes an optional comparison function. In your case the comparison function compares the extensions. In the case below, sort uses reverse_cmp to sort numbers in descending order.

code:
def reverse_cmp(a, b):
    return cmp(b, a)

print sorted([1,9,3,5,6,7,3,2,3], cmp=reverse_cmp)
cmp is a function that takes two arguments and returns -1, 0, or 1 if a < b, a == b, or a > b respectively. sort takes an argument called cmp that it expects to behave in the same way.

However, it is often better to do a 'decorate-sort-undecorate' or a 'schwartzian-transform', than to have a complex comparison.

Instead of extracting the file extension for every comparison, we extract it once per filename and sort on that list instead.

For example, to sort a list of strings by their length, we 'decorate' it into a list of pairs:

code:
text = ['aaaaa','bbb','cccc','dd','e','fffff','gg']

tmp_text = []

for t in text:
    tmp_text.append((len(t), t))
tmp_text now contains [(5,'aaaaa'), (3,'bbb'),.....]

Then we sort by the first pair member:

code:
def cmp_first(a,b):
    return cmp(a[0],b[0])

print sorted(tmp_text, cmp=cmp_first)
Finally, we just 'undecorate' the list to remove the lengths (no code for this one, i'm lazy)

This is such a common idiom - preprocessing the keys and sorting by them - that sort takes an optional argument called key that takes a function to do this.

code:
print sorted(['aaaaa','bbb','cccc','dd','e','fffff','gg'], key=len)
So, for your original example, here is sort using the file extension as the key to sort on:

code:
import os.path

def get_file_ext(filename):
    return os.path.splitext(filename)[1]

print sorted(['foo.txt','bar.pdf','baz.doc'], key=get_file_ext)
I would advocate using the latter over the example you gave :)

tef fucked around with this message at 11:10 on Oct 28, 2008

tef
May 30, 2004

-> some l-system crap ->
Add c:\python26 to the environment variable PATH?

tef
May 30, 2004

-> some l-system crap ->
Often, if you have some mutable data structure with a number of functions that operate on it, it's easier to make it a class.
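
A tiny made-up example of what I mean:

code:
class Account(object):
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount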

I would also advocate reading through the python tutorial.

http://docs.python.org/tutorial/index.html

tef
May 30, 2004

-> some l-system crap ->
Edit: Should refresh before posting.

tef
May 30, 2004

-> some l-system crap ->

mr noller posted:

Imagine a concurrent package for python: in it you have your basics - threading, multiprocessing (maybe "acting" too, but that name sucks,
how about greenprocesses? :)) and you also have utilities like
multiprocessing.pool, multiprocessing.manager/etc,
threading.producer_consumer, etc. We build these out and make them "as
safe as possible".

As an aside, lightweight processes are often called fibers.

quote:

So, while Erlang's syntax makes me about as excited as punching myself in the face, I'd rather know more about the Actor system as implemented in it.

There is more to erlang than just the message passing concurrency though, and I would really suggest looking at it in depth for some of the other details. OTP is really geared towards making robust applications.

One of the things is that when a process dies, it sends a termination message to its parents, which, if uncaught, forces them to die too. If nothing matches in a case expression, an error is thrown too.

There are also inherent limitations to Erlang's model. There is only one 'inbox' per process and no way to separate out messages in advance. This means that if you are not careful, it takes a linear scan to find any high-priority messages first.

See this post for an example: http://www.lshift.net/blog/2007/10/01/too-much-mail-is-bad-for-you

quote:

Just to add: One of the big drawbacks to the multiprocessing package is IPC - the serialization/de-serialization costs can harm you if you're passing crap-tons of objects, not to mention the basic requirement of pickle-ability. Erlang's actors are not independent processes.

It is not just because Erlang's processes are independent, but also because the language is pure, that its message passing can be so cheap. It doesn't have to copy any data or maintain synchronization.

Singularity manages to make message passing cheap by enforcing uniqueness. Only one process can use shared memory at a time, and when anything is sent the original process loses ownership.

Although Erlang is the flavour of the month, there are a couple of other interesting approaches to concurrency, notably the join calculus.

tef
May 30, 2004

-> some l-system crap ->
use expect + ssh.

alternatively, use the subprocess module to launch ssh, and manipulate it

tef
May 30, 2004

-> some l-system crap ->

chemosh6969 posted:

code:
proc = subprocess.Popen('f:\\putty username@server -pw password',
                        shell=True,
                        stdin=subprocess.PIPE,
                        )
proc.communicate('2\n')

Try:

code:
import subprocess
proc = subprocess.Popen(['f:\\plink', 'username@server', '-pw', 'password'],
                        shell=True,
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        )
proc.stdin.write('2\n')
print proc.stdout.read()
?

You have to specify PIPE for each file handle you wish to control, otherwise it inherits the parent's by default.

PuTTY is a GUI program, not a terminal program, so it is hard to control its input and output. What you want instead is plink or ssh, which are console/terminal programs.

Plink is part of the putty distribution, and is documented here: http://the.earth.li/~sgtatham/putty/0.60/htmldoc/Chapter7.html#plink


tef
May 30, 2004

-> some l-system crap ->

Habnabit posted:

You can't use shell=True if you're passing a list of parameters

Didn't realise that, ta.
