Through pip, you can add more packages to your python/Jupyter. The following are examples that apply to python3.
Open a Terminal in JupyterLab and run the following command:
python3 -m pip install pycuda --user
To test whether your PyCUDA installation works, try this python script in JupyterLab. The script contains two URLs that explain the concepts of CUDA thread blocks and thread indexing.
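The linked script is not reproduced here; as a rough stand-in, below is a minimal sketch of a PyCUDA check that exercises the same thread-block and thread-indexing ideas. The kernel, array size, and block/grid shapes are illustrative choices, not taken from the original script:

```python
# Minimal PyCUDA smoke test: double an array on the GPU, verify on the host.
import numpy as np
import pycuda.autoinit                 # creates a CUDA context on import
import pycuda.driver as cuda
from pycuda.compiler import SourceModule

mod = SourceModule("""
__global__ void double_array(float *a)
{
    // Global thread index = block offset + thread offset within the block
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    a[idx] *= 2.0f;
}
""")
double_array = mod.get_function("double_array")

a = np.random.randn(400).astype(np.float32)
a_gpu = cuda.mem_alloc(a.nbytes)
cuda.memcpy_htod(a_gpu, a)

# 4 blocks x 100 threads = 400 threads, one per array element
double_array(a_gpu, block=(100, 1, 1), grid=(4, 1))

result = np.empty_like(a)
cuda.memcpy_dtoh(result, a_gpu)
assert np.allclose(result, 2 * a)
print("PyCUDA works")
```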
DASK allows you to break down a large calculation into smaller pieces that run in parallel. Here we are mostly interested in using DASK to spread the calculation across multiple SLURM jobs.
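To illustrate the idea independently of SLURM, here is a small sketch of how DASK chunks a calculation; the array size and chunk shape are arbitrary values chosen for illustration:

```python
# Dask splits the 10000x10000 array into 100 independent 1000x1000 chunks;
# the sum is computed chunk by chunk, in parallel where workers are available.
import dask.array as da

x = da.ones((10000, 10000), chunks=(1000, 1000))
print(x.sum().compute())
```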
To enable DASK distributed scheduling/running with SLURM, first make sure that when you start a JupyterLab instance, you choose an image that supports SLURM job submission, such as atlas-jupyter-w-slurm-cli/20200714. You will also need to use pip3 to install dask-jobqueue and distributed:
Open a Terminal in JupyterLab and run the following command:
python3 -m pip install --ignore-installed dask numpy dask-jobqueue distributed --user
To test whether it works, try this python script in JupyterLab. Pay attention to the line python="/usr/bin/python3" in the script; do not omit it.
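That script is linked from the original page; as a rough stand-in, here is a minimal sketch of how dask-jobqueue submits Dask workers as SLURM jobs. The cores, memory, walltime, and job count are placeholder values (your site may also require a queue/partition name); the python= line is the one highlighted above:

```python
from dask.distributed import Client
from dask_jobqueue import SLURMCluster
import dask.array as da

# Each SLURM job launched by the cluster starts one Dask worker.
cluster = SLURMCluster(
    cores=1,                      # placeholder resources per SLURM job
    memory="2GB",
    walltime="00:30:00",
    python="/usr/bin/python3",    # interpreter the workers run; do not omit
)
cluster.scale(jobs=4)             # submit 4 SLURM jobs

client = Client(cluster)

# A trivial computation spread over the SLURM-hosted workers
x = da.random.random((20000, 20000), chunks=(2000, 2000))
print(x.mean().compute())

client.close()
cluster.close()
```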
You can use your own Conda environment along with the ATLAS Jupyter environment. Suppose you want to set up pyhf via Conda and use it in the ATLAS Jupyter environment; here is what you can do:
Open a Terminal in JupyterLab. You have two choices: “python3 -m pip install pyhf” is the easiest way (but not via Conda). Here we show how to do this in a Conda environment and make it available in Jupyter.
Suppose you already have miniconda3 or Anaconda3 installed:
conda create --name mypyhf
conda activate mypyhf
conda install -c conda-forge ipykernel pyhf
python3 -m ipykernel install --user --name=mypyhf
(If you later want to remove this kernel, run: jupyter kernelspec uninstall mypyhf)
After this, restart the Jupyter environment and you will see a new kernel called mypyhf.
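To confirm the new kernel sees pyhf, open a notebook with the mypyhf kernel and run a quick check like the sketch below; the model numbers are arbitrary illustration values (in older pyhf releases this helper was named pyhf.simplemodels.hepdata_like):

```python
# Quick pyhf sanity check: build a one-bin model and fit it to toy data.
import pyhf

model = pyhf.simplemodels.uncorrelated_background(
    signal=[5.0], bkg=[10.0], bkg_uncertainty=[3.0]
)
data = [15.0] + model.config.auxdata   # observed count plus auxiliary data
best_fit = pyhf.infer.mle.fit(data, model)
print(best_fit)
```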