Module: parallel.apps.launcher

Facilities for launching IPython processes asynchronously.

42 Classes

class IPython.parallel.apps.launcher.LauncherError

Bases: Exception

class IPython.parallel.apps.launcher.ProcessStateError

Bases: IPython.parallel.apps.launcher.LauncherError

class IPython.parallel.apps.launcher.UnknownStatus

Bases: IPython.parallel.apps.launcher.LauncherError

class IPython.parallel.apps.launcher.BaseLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.config.configurable.LoggingConfigurable

An abstraction for starting, stopping and signaling a process.

__init__(work_dir='.', config=None, **kwargs)
arg_str

The string form of the program arguments.

args

A list of cmd and args that will be used to start the process.

This is what is passed to spawnProcess() and the first element will be the process name.

find_args()

The .args property calls this to find the args list.

Subclasses should implement this to construct the cmd and args.

notify_start(data)

Call this to trigger startup actions.

This logs the process startup and sets the state to ‘running’. It is a pass-through so it can be used as a callback.

notify_stop(data)

Call this to trigger process stop actions.

This logs the process stopping and sets the state to ‘after’. Call this to trigger callbacks registered via on_stop().

on_stop(f)

Register a callback to be called with this Launcher’s stop_data when the process actually finishes.

running

Am I running?

signal(sig)

Signal the process.

Parameters:

sig : str or int

‘KILL’, ‘INT’, etc., or any signal number

start()

Start the process.

stop()

Stop the process and notify observers of stopping.

This method will return None immediately. To observe the actual process stopping, see on_stop().
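
As a rough sketch of this interface, the hypothetical subclass below implements only find_args() and then exercises the notify/on_stop contract described above; concrete subclasses such as LocalProcessLauncher call notify_start() and notify_stop() around a real process.

from IPython.parallel.apps.launcher import BaseLauncher

class DummyLauncher(BaseLauncher):
    """Hypothetical launcher used only to illustrate the interface."""
    def find_args(self):
        # Called by the .args property; the first element is the process name.
        return ['dummy-program', '--flag']

launcher = DummyLauncher(work_dir='.')

def report(stop_data):
    # on_stop() callbacks receive the launcher's stop_data when it finishes.
    print(stop_data)

launcher.on_stop(report)

# A concrete start() would spawn launcher.args and call notify_start();
# notify_stop() sets the state to 'after' and fires the registered callbacks.
launcher.notify_start({'pid': 12345})
launcher.notify_stop({'exit_code': 0})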

class IPython.parallel.apps.launcher.ClusterAppMixin(*args, **kw)

Bases: IPython.utils.traitlets.HasTraits

MixIn for cluster args as traits

class IPython.parallel.apps.launcher.ControllerMixin(*args, **kw)

Bases: IPython.parallel.apps.launcher.ClusterAppMixin

class IPython.parallel.apps.launcher.EngineMixin(*args, **kw)

Bases: IPython.parallel.apps.launcher.ClusterAppMixin

class IPython.parallel.apps.launcher.LocalProcessLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.BaseLauncher

Start and stop an external process in an asynchronous manner.

This will launch the external process with a working directory of self.work_dir.

__init__(work_dir='.', config=None, **kwargs)
interrupt_then_kill(delay=2.0)

Send INT, wait a delay and then send KILL.
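
A minimal sketch of driving LocalProcessLauncher directly; the subclass and command are hypothetical, and delivery of the on_stop() callback assumes a running event loop, as when these launchers are driven by ipcluster.

from IPython.parallel.apps.launcher import LocalProcessLauncher

class SleepLauncher(LocalProcessLauncher):
    """Hypothetical launcher that runs a short-lived external command."""
    def find_args(self):
        return ['sleep', '30']

def report(data):
    print(data)  # stop_data from the finished process

launcher = SleepLauncher(work_dir='.')
launcher.on_stop(report)                 # fired once the process exits
launcher.start()                         # spawn with cwd=self.work_dir
launcher.interrupt_then_kill(delay=2.0)  # INT now, KILL two seconds later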

class IPython.parallel.apps.launcher.LocalControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalProcessLauncher, IPython.parallel.apps.launcher.ControllerMixin

Launch a controller as a regular external process.

start()

Start the controller by profile_dir.

class IPython.parallel.apps.launcher.LocalEngineLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalProcessLauncher, IPython.parallel.apps.launcher.EngineMixin

Launch a single engine as a regular external process.

class IPython.parallel.apps.launcher.LocalEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalEngineLauncher

Launch a set of engines as regular external processes.

__init__(work_dir='.', config=None, **kwargs)
launcher_class

alias of LocalEngineLauncher

start(n)

Start n engines by profile or profile_dir.

class IPython.parallel.apps.launcher.MPILauncher(*args, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalProcessLauncher

Launch an external process using mpiexec.

__init__(*args, **kwargs)
find_args()

Build self.args using all the fields.

start(n)

Start n instances of the program using mpiexec.
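
The MPI launchers are normally selected and tuned through ipcluster configuration rather than used directly. A hedged sketch of an ipcluster_config.py follows: the hostfile and extra arguments are hypothetical, the launcher-class options assume the standard ipcluster configuration hooks, and mpi_cmd/mpi_args are assumed to be the configurable traits that find_args() assembles into the command line.

# ipcluster_config.py (sketch)
c = get_config()

# Select the MPI launchers for the controller and the engine set.
c.IPClusterStart.controller_launcher_class = \
    'IPython.parallel.apps.launcher.MPIControllerLauncher'
c.IPClusterEngines.engine_launcher_class = \
    'IPython.parallel.apps.launcher.MPIEngineSetLauncher'

# Assumed traits read by find_args() when building the mpiexec command line.
c.MPILauncher.mpi_cmd = ['mpiexec']
c.MPILauncher.mpi_args = ['--hostfile', 'hosts.txt']  # hypothetical extra args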

class IPython.parallel.apps.launcher.MPIControllerLauncher(*args, **kwargs)

Bases: IPython.parallel.apps.launcher.MPILauncher, IPython.parallel.apps.launcher.ControllerMixin

Launch a controller using mpiexec.

start()

Start the controller by profile_dir.

class IPython.parallel.apps.launcher.MPIEngineSetLauncher(*args, **kwargs)

Bases: IPython.parallel.apps.launcher.MPILauncher, IPython.parallel.apps.launcher.EngineMixin

Launch engines using mpiexec

start(n)

Start n engines by profile or profile_dir.

class IPython.parallel.apps.launcher.DeprecatedMPILauncher

Bases: object

class IPython.parallel.apps.launcher.MPIExecLauncher(*args, **kwargs)

Bases: IPython.parallel.apps.launcher.MPILauncher, IPython.parallel.apps.launcher.DeprecatedMPILauncher

Deprecated, use MPILauncher

__init__(*args, **kwargs)
class IPython.parallel.apps.launcher.MPIExecControllerLauncher(*args, **kwargs)

Bases: IPython.parallel.apps.launcher.MPIControllerLauncher, IPython.parallel.apps.launcher.DeprecatedMPILauncher

Deprecated, use MPIControllerLauncher

__init__(*args, **kwargs)
class IPython.parallel.apps.launcher.MPIExecEngineSetLauncher(*args, **kwargs)

Bases: IPython.parallel.apps.launcher.MPIEngineSetLauncher, IPython.parallel.apps.launcher.DeprecatedMPILauncher

Deprecated, use MPIEngineSetLauncher

__init__(*args, **kwargs)
class IPython.parallel.apps.launcher.SSHLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalProcessLauncher

A minimal launcher for ssh.

To be useful this will probably have to be extended to use the sshx idea for environment variables. There could be other things this needs as well.

fetch_files()

fetch remote files (called after start)

send_files()

send our files (called before start)
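
A hedged configuration sketch for the file staging that send_files() and fetch_files() perform; the hostname and paths are hypothetical, and to_send/to_fetch are assumed to be lists of (local, remote) file pairs used by those methods.

# ipcluster_config.py (sketch)
c = get_config()

c.SSHLauncher.hostname = 'node1.example.com'  # hypothetical remote host
c.SSHLauncher.user = 'ipuser'

# Copied to the remote host before start() ...
c.SSHLauncher.to_send = [
    ('./security/ipcontroller-engine.json',
     '.ipython/profile_default/security/ipcontroller-engine.json'),
]
# ... and copied back from the remote host after start().
c.SSHLauncher.to_fetch = []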

class IPython.parallel.apps.launcher.SSHClusterLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.SSHLauncher, IPython.parallel.apps.launcher.ClusterAppMixin

class IPython.parallel.apps.launcher.SSHControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.SSHClusterLauncher, IPython.parallel.apps.launcher.ControllerMixin

class IPython.parallel.apps.launcher.SSHEngineLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.SSHClusterLauncher, IPython.parallel.apps.launcher.EngineMixin

class IPython.parallel.apps.launcher.SSHEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalEngineSetLauncher

engine_count

determine engine count from engines dict

start(n)

Start engines by profile or profile_dir. n is ignored, and the engines config property is used instead.
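
The engines dict maps hostnames to engine counts; a sketch with hypothetical hostnames, assuming the standard ipcluster launcher-class option.

# ipcluster_config.py (sketch)
c = get_config()

c.IPClusterEngines.engine_launcher_class = \
    'IPython.parallel.apps.launcher.SSHEngineSetLauncher'

# Hypothetical hosts: start 2 engines on host1 and 4 on host2.
c.SSHEngineSetLauncher.engines = {
    'host1.example.com': 2,
    'host2.example.com': 4,
}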

class IPython.parallel.apps.launcher.SSHProxyEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.SSHClusterLauncher

Launcher for calling ipcluster engines on a remote machine.

Requires that remote profile is already configured.

class IPython.parallel.apps.launcher.WindowsHPCLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.BaseLauncher

__init__(work_dir='.', config=None, **kwargs)
parse_job_id(output)

Take the output of the submit command and return the job id.

start(n)

Start n copies of the process using the Win HPC job scheduler.

class IPython.parallel.apps.launcher.WindowsHPCControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.WindowsHPCLauncher, IPython.parallel.apps.launcher.ClusterAppMixin

start()

Start the controller by profile_dir.

class IPython.parallel.apps.launcher.WindowsHPCEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.WindowsHPCLauncher, IPython.parallel.apps.launcher.ClusterAppMixin

start(n)

Start n engines by profile_dir.

class IPython.parallel.apps.launcher.BatchClusterAppMixin(*args, **kw)

Bases: IPython.parallel.apps.launcher.ClusterAppMixin

ClusterApp mixin that updates the self.context dict, rather than the command-line args.

class IPython.parallel.apps.launcher.BatchSystemLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.BaseLauncher

Launch an external process using a batch system.

This class is designed to work with UNIX batch systems like PBS, LSF, GridEngine, etc. The overall model is that there are different commands like qsub, qdel, etc. that handle the starting and stopping of the process.

This class also has the notion of a batch script. The batch_template attribute can be set to a string that is a template for the batch script. This template is instantiated using string formatting. Thus the template can use {n} for the number of instances. Subclasses can add additional variables to the template dict.

__init__(work_dir='.', config=None, **kwargs)
parse_job_id(output)

Take the output of the submit command and return the job id.

start(n)

Start n copies of the process using a batch system.

write_batch_script(n)

Instantiate and write the batch script to the work_dir.
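
A sketch of setting the batch_template described above, shown here for the PBS engine launcher defined below; the scheduler directives are illustrative, and {n} and {profile_dir} are assumed to be available in the template context (BatchClusterAppMixin populates the context dict).

# ipcluster_config.py (sketch)
c = get_config()

c.PBSEngineSetLauncher.batch_template = """#!/bin/sh
#PBS -N ipengine
#PBS -l nodes={n}
cd $PBS_O_WORKDIR
mpiexec -n {n} ipengine --profile-dir={profile_dir}
"""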

class IPython.parallel.apps.launcher.PBSLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.BatchSystemLauncher

A BatchSystemLauncher subclass for PBS.

class IPython.parallel.apps.launcher.PBSControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.PBSLauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch a controller using PBS.

start()

Start the controller by profile or profile_dir.

class IPython.parallel.apps.launcher.PBSEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.PBSLauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch Engines using PBS

class IPython.parallel.apps.launcher.SGELauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.PBSLauncher

Sun GridEngine is a PBS clone with slightly different syntax.

class IPython.parallel.apps.launcher.SGEControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.SGELauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch a controller using SGE.

start()

Start the controller by profile or profile_dir.

class IPython.parallel.apps.launcher.SGEEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.SGELauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch Engines with SGE

class IPython.parallel.apps.launcher.LSFLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.BatchSystemLauncher

A BatchSystemLauncher subclass for LSF.

start(n)

Start n copies of the process using the LSF batch system. This can't inherit the base class implementation because bsub expects to be piped a shell script in order to honor the #BSUB directives: bsub < script
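
A rough illustration, not the class's actual code, of the submission pattern the docstring describes: the script must arrive on bsub's stdin so the #BSUB directives are honored. The script filename is hypothetical.

import subprocess

# Equivalent of "bsub < script": pipe the batch script to bsub's stdin.
with open('lsf_batch_script', 'rb') as script:  # hypothetical filename
    output = subprocess.check_output(['bsub'], stdin=script)
# parse_job_id(output), inherited from BatchSystemLauncher, would then
# extract the job id from bsub's output.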

class IPython.parallel.apps.launcher.LSFControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LSFLauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch a controller using LSF.

start()

Start the controller by profile or profile_dir.

class IPython.parallel.apps.launcher.LSFEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LSFLauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch Engines using LSF

class IPython.parallel.apps.launcher.HTCondorLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.BatchSystemLauncher

A BatchSystemLauncher subclass for HTCondor.

HTCondor requires that we launch the ipengine/ipcontroller scripts rather than the python instance, but is otherwise very similar to PBS. This is because HTCondor destroys sys.executable when launching remote processes: a launched python process depends on sys.executable to evaluate its module search paths. Without it, regardless of which python interpreter you launch, you will only get the built-in module search paths.

We use the ip{cluster, engine, controller} scripts as our executables to circumvent this: because they are shebanged scripts, the python binary is launched with argv[0] set to the location of those scripts on the remote node. This means you need to take care that:

  1. Your remote nodes have their paths configured correctly, so that the ipengine and ipcontroller of the python environment you wish to execute code in take precedence.
  2. This functionality is untested on Windows.

If you need different behavior, consider writing your own template.
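
A hedged sketch of an HTCondor submit template following the approach above, with the ipengine script (not python) as the executable; the directives are illustrative, and {n} / {profile_dir} are assumed to be available in the template context.

# ipcluster_config.py (sketch)
c = get_config()

c.HTCondorEngineSetLauncher.batch_template = """
universe = vanilla
# Use the ipengine script as the executable and assume it is already on
# the PATH of the execute node (see point 1 above).
executable = ipengine
transfer_executable = False
arguments = "--profile-dir={profile_dir}"
queue {n}
"""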

class IPython.parallel.apps.launcher.HTCondorControllerLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.HTCondorLauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch a controller using HTCondor.

start()

Start the controller by profile or profile_dir.

class IPython.parallel.apps.launcher.HTCondorEngineSetLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.HTCondorLauncher, IPython.parallel.apps.launcher.BatchClusterAppMixin

Launch Engines using HTCondor

class IPython.parallel.apps.launcher.IPClusterLauncher(work_dir='.', config=None, **kwargs)

Bases: IPython.parallel.apps.launcher.LocalProcessLauncher

Launch the ipcluster program in an external process.

2 Functions

IPython.parallel.apps.launcher.check_output(*popenargs, timeout=None, **kwargs)

Run command with arguments and return its output.

If the exit code was non-zero it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute and output in the output attribute.

The arguments are the same as for the Popen constructor. Example:

>>> check_output(["ls", "-l", "/dev/null"])
b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'

The stdout argument is not allowed as it is used internally. To capture standard error in the result, use stderr=STDOUT.

>>> check_output(["/bin/sh", "-c",
...               "ls -l non_existent_file ; exit 0"],
...              stderr=STDOUT)
b'ls: non_existent_file: No such file or directory\n'

There is an additional optional argument, “input”, allowing you to pass a string to the subprocess’s stdin. If you use this argument you may not also use the Popen constructor’s “stdin” argument, as it too will be used internally. Example:

>>> check_output(["sed", "-e", "s/foo/bar/"],
...              input=b"when in the course of fooman events\n")
b'when in the course of barman events\n'

If universal_newlines=True is passed, the “input” argument must be a string and the return value will be a string rather than bytes.

IPython.parallel.apps.launcher.find_job_cmd()