The charming adventures of an analyst and his solver.
How to capture standard output when running a child process in Python.
Lately at work I've been putting together a web management console that handles execution of some of our long-running Gurobi models. This makes a lot of sense: the models run on Gurobi Cloud for hours or even days before achieving their desired optimality gaps, and Gurobi Cloud requires a persistent connection, so we use a server to maintain that connection instead of solving models from our desktop machines. In a sense, this is a bit of a throwback to the time-sharing systems from before the PC era.
As a result, I've had to update my dormant (and, to be honest, never particularly acute) administration and systems programming skills. There are always some interesting gotchas that come up in this sort of work. This time I ran into the following conundrum: I have a cron job that is responsible for constructing model instances in a child process, handing them off to Gurobi Cloud, and saving their results. Its output is redirected to a file so I can tail it in real time and store it for later. For the web console, I also want to store the output to a database so I can easily display it. And I'd like to do this in as generic a manner as possible, so I can add other types of models (custom heuristics, for instance) and also capture anything they write to standard output and standard error.
There are a number of threads available on capturing standard output. Some of the advice I found recommended using os.dup2 to clone the stdout file descriptor into another file. The problem with this approach is that it requires an open file, which is inconvenient, and it ends up closing the existing stdout. So the output does get redirected, but then it disappears from my log. I also could only seem to capture it once, and I wanted to do this repeatedly.
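For reference, here is a rough sketch of what that file-descriptor approach looks like. The file name is illustrative, and this version saves a copy of the original descriptor so stdout can be restored afterward; even so, it requires an open file and some juggling of raw descriptors.

```python
import os
import sys

# Redirect file descriptor 1 (stdout) into an open log file.
log = open('captured.log', 'w')
sys.stdout.flush()
saved_fd = os.dup(1)        # keep a copy of the real stdout
os.dup2(log.fileno(), 1)    # fd 1 now points at the file

os.write(1, b'this goes to the file, not the console\n')

os.dup2(saved_fd, 1)        # restore the original stdout
os.close(saved_fd)
log.close()
```

While the redirection is in effect, nothing written to stdout reaches the console at all, which is exactly the drawback described above.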
Other advice got me closer to the mark: replace sys.stdout with an instance of StringIO, then just read its contents later. This allowed me to do redirections serially, but once again it meant that standard output was essentially shut off.
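A minimal sketch of that StringIO swap looks something like this (the try/except import keeps it working on both Python 2 and 3):

```python
import sys
try:
    from StringIO import StringIO  # Python 2
except ImportError:
    from io import StringIO        # Python 3

# Swap sys.stdout for an in-memory buffer, then read it back.
buffer = StringIO()
old_stdout = sys.stdout
sys.stdout = buffer
try:
    print('captured')           # goes into the buffer, not the console
finally:
    sys.stdout = old_stdout     # restore the real stdout

contents = buffer.getvalue()
```

The capture works and can be repeated, but while the swap is active nothing appears on the real stdout, which is the limitation the next approach fixes.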
The final solution was to subclass StringIO and override its write method. Every time write is called, it writes to both the in-memory string and a canonical reference to stdout. Example code is given below that captures stdout and stderr from a child process while still letting their messages be visible.
from StringIO import StringIO
from multiprocessing import Process, Queue
import sys

# Proxy classes that both write to and capture strings
# sent to stdout or stderr, respectively.
class TeeOut(StringIO):
    def write(self, s):
        StringIO.write(self, s)
        sys.__stdout__.write(s)

class TeeErr(StringIO):
    def write(self, s):
        StringIO.write(self, s)
        sys.__stderr__.write(s)

class Parent(object):
    def run(self):
        # Start child process.
        queue = Queue()
        child = Process(target=self.spawn, args=(queue,))
        child.start()
        child.join()

        # Do something with out and err...
        out = queue.get()
        err = queue.get()

    def spawn(self, queue):
        # Save everything that would otherwise go to stdout.
        out = TeeOut()
        sys.stdout = out
        err = TeeErr()
        sys.stderr = err

        try:
            # Do something...
            print 'out 1'
            print 'out 2'
            sys.stderr.write('error 1\n')
            print 'out 3'
            sys.stderr.write('error 2\n')

        finally:
            # Restore stdout, stderr and send their contents.
            sys.stdout = sys.__stdout__
            sys.stderr = sys.__stderr__
            queue.put(out.getvalue())
            queue.put(err.getvalue())

if __name__ == '__main__':
    Parent().run()
A few minor points: