Evolving a PDF
Method 1: Using apply_pdf
In this first part, we compute the eko and subsequently apply it to the initial PDF "manually" by calling a dedicated function.
Next, we need to load the PDF that we want to evolve. EKO uses the same interface as lhapdf to query for the actual values of PDFs. However, for the scope of this tutorial we want to avoid the complication of dealing with an external dependency. Therefore we will use the toy PDFs as they were established by the Les Houches benchmark setting. They are provided in the banana-hep
package available from PyPI, so first run in your shell:
$ pip install banana-hep
and then in your Python interpreter:
[1]:
import pathlib
from banana import toy
import eko
# create the toy PDF object (the arguments mimic lhapdf.mkPDF's set name and member)
pdf = toy.mkPDF("", 0)
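Since the toy PDF mimics the lhapdf interface, you can already query it directly. A minimal sketch (assuming the toy object exposes xfxQ2 just like a genuine lhapdf PDF, which is what EKO relies on):
# query the gluon (pid 21) of the toy PDF at x = 0.01 and Q^2 = 100 GeV^2
print(pdf.xfxQ2(21, 0.01, 100.0))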
Now, we have all ingredients at hand to evolve the PDF set with the operator:
[2]:
from ekobox.apply import apply_pdf
with eko.EKO.read("./myeko.tar") as evolution_operator:
    evolved_pdfs, _integration_errors = apply_pdf(evolution_operator, pdf)
The function returns two dictionaries, one filled with the PDF values for each final scale (evolved_pdfs), and the other containing the integration errors (_integration_errors). Both of them have the same structure; for example, let's inspect evolved_pdfs:
[3]:
evolved_pdfs.keys()
[3]:
dict_keys([(10000.0, 5)])
Each final scale maps to a dictionary in which all PDF values are held, keyed by the Monte Carlo particle identifiers; each entry contains the PDF values at the requested interpolation points.
E.g. to access the gluon PDF at \(Q^2 = 10000\,\text{GeV}^2\) you can run:
[4]:
evolved_pdfs[(10000.0, 5)][21]
[4]:
array([ 2.58966998e+04, 4.74768306e+02, 3.57939013e+01, -1.03946292e+01,
0.00000000e+00])
Note that we return the actual PDF and not the momentum fraction times the PDF.
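If you instead need the momentum fraction times the PDF, x f(x) (the quantity returned e.g. by LHAPDF's xfxQ2), you can multiply by the interpolation points. A minimal sketch, assuming you know the x-grid on which the operator was computed (the values below are purely illustrative):
import numpy as np

# hypothetical x-grid: must match the interpolation points of the computed operator
xgrid = np.array([1e-3, 1e-2, 1e-1, 5e-1, 1.0])

# apply_pdf returns f(x); multiply pointwise by x to obtain x*f(x)
xg = xgrid * evolved_pdfs[(10000.0, 5)][21]
print(xg)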
Method 2: Using evolve_pdfs
In this second part we illustrate how to directly obtain (and install) an LHAPDF set evolved with eko.
First, we define our initial PDF. Here, we will use the same toy PDF as in the previous example, but any LHAPDF-like object will do the job!
[5]:
from banana import toy
pdf = toy.mkPDF("", 0)
Now, we set the theory inputs: in this example we will evolve our toy PDF at LO and create a new LHAPDF object with a mugrid of size two.
[6]:
from ekobox.cards import example
th_card = example.theory()
op_card = example.operator()
# here we replace the grid with a very minimal one, to speed up the example
op_card.xgrid = eko.interpolation.XGrid([1e-3, 1e-2, 1e-1, 5e-1, 1.0])
op_card.mugrid = [(10.0, 5), (100.0, 5)]
# set QCD LO evolution
th_card.orders = (1, 0)
Finally, we are ready to run eko and install the new PDF set. Note that if the evolved PDF already exists, the code will overwrite it.
Additionally, you can set path to load a precomputed EKO, while with store_path you can save the produced EKO and reuse it later. You can also pass several PDF objects at once (e.g. replicas).
[7]:
from ekobox.evol_pdf import evolve_pdfs
path = pathlib.Path("./myeko2.tar")
path.unlink(missing_ok=True)
evolve_pdfs([pdf], th_card, op_card, install=True, name="Evolved_PDF", store_path=path)
install_pdf Evolved_PDF
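Since we stored the computed EKO at store_path, a later call can reuse it instead of recomputing the operator. A sketch, assuming the path argument accepts the previously stored file (the set name "Evolved_PDF_reuse" is just a hypothetical label):
# reuse the precomputed EKO stored above instead of recomputing it (sketch)
evolve_pdfs(
    [pdf], th_card, op_card, path=path, install=True, name="Evolved_PDF_reuse"
)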
Now, you can access the evolved PDF like any other PDF set (note that this requires the Python interface of lhapdf).
[8]:
import lhapdf
evolved_pdf = lhapdf.mkPDF("Evolved_PDF", 0)
LHAPDF 6.5.0 loading /Users/giacomomagni/.conda/envs/eko_dev/share/LHAPDF/Evolved_PDF/Evolved_PDF_0000.dat
Evolved_PDF PDF set, member #0, version 1
To obtain the value of the gluon PDF at a given scale you can simply do:
[9]:
pid = 21 # gluon pid
Q2 = 89.10 # Q^2 in GeV^2
x = 0.01 # momentum fraction
# check that the particle is present
print("has gluon?", evolved_pdf.hasFlavor(pid))
# now do the lookup
xg = evolved_pdf.xfxQ2(pid, x, Q2)
print(f"xg(x={x}, Q2={Q2}) = {xg}")
has gluon? True
xg(x=0.01, Q2=89.1) = 4.399576390180355
A more Realistic Example: Benchmark to CT14llo
In this part of the tutorial we perform a benchmark, showing how PDFs evolved with eko reproduce the values from the original LHAPDF grids.
First, we need to set up the theory and operator runcards to match the settings used to produce the chosen PDF; here we will use CT14llo.
We have to use LO evolution, and we choose to dump our PDF into grids with 5 values of Q2 and 60 points in x-space, logarithmically spaced between 1e-7 and 0.1 and linearly spaced from 0.1 to 1.
[10]:
from math import nan
import lhapdf
import numpy as np
from eko.interpolation import make_grid
from eko.quantities.heavy_quarks import HeavyQuarks, QuarkMassRef
from ekobox.cards import example
# get the PDF object
ct14llo = lhapdf.mkPDF("CT14llo")
# setup the operator card
op_card = example.operator()
op_card.xgrid = eko.interpolation.XGrid(make_grid(30, 30)) # x grid
op_card.mugrid = [(float(q), 5) for q in np.geomspace(5.0, 100, 5)] # Q grid in GeV (each point with nf=5)
op_card.init = (1.295000, 3) # starting point of the evolution
# setup the theory card - this can be mostly inferred from the PDF's .info file
th_card = example.theory()
th_card.orders = (1, 0) # QCD LO
th_card.heavy.masses = HeavyQuarks(
    [QuarkMassRef([1.3, nan]), QuarkMassRef([4.75, nan]), QuarkMassRef([172.0, nan])]
) # heavy quark masses: charm, bottom, top
th_card.couplings.alphas = 0.130000 # reference value of alpha_s
th_card.couplings.ref = (
    91.1876,
    5,
) # the reference scale together with the number of flavors at which alpha_s is provided
LHAPDF 6.5.0 loading /Users/giacomomagni/.conda/envs/eko_dev/share/LHAPDF/CT14llo/CT14llo_0000.dat
CT14llo PDF set, member #0, version 1; LHAPDF ID = 13205
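As a quick sanity check of these settings, you can compare the strong coupling of the original set at the reference scale with the value we just put into the theory card. A minimal sketch using LHAPDF's alphasQ method:
# alpha_s of CT14llo at the Z mass should be close to th_card.couplings.alphas
print(ct14llo.alphasQ(91.1876), th_card.couplings.alphas)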
Next, we run the evolution using method 2 and save the new PDF. Due to the extended x and Q2 grids this might take a minute, so please be patient …
[22]:
from ekobox.evol_pdf import evolve_pdfs
path = pathlib.Path("./myeko_ct14llo.tar")
path.unlink(missing_ok=True)
evolve_pdfs(
    [ct14llo], th_card, op_card, install=True, name="my_ct14llo", store_path=path
)
install_pdf my_ct14llo
Now, we can compare the values given by the original PDF set and the one evolved with eko, both at different x and Q2 scales, for a chosen parton; here we look at the gluon:
[23]:
import pandas as pd
# load evolved pdf
my_ct14llo = lhapdf.mkPDF("my_ct14llo", 0)
pid = 21 # gluon pid
# collect data
log = {"x": [], "Q2": [], "ct14llo": [], "my_ct14llo": [], "relative_diff": []}
for q in np.geomspace(5.0, 100, 5):
    q2 = q**2.0
    for x in np.geomspace(1e-5, 0.9, 5):
        value = ct14llo.xfxQ2(pid, x, q2)
        my_value = my_ct14llo.xfxQ2(pid, x, q2)
        log["x"].append(x)
        log["Q2"].append(q2)
        log["ct14llo"].append(value)
        log["my_ct14llo"].append(my_value)
        log["relative_diff"].append((value - my_value) / value)
print(pd.DataFrame(log))
LHAPDF 6.5.0 loading /Users/giacomomagni/.conda/envs/eko_dev/share/LHAPDF/my_ct14llo/my_ct14llo_0000.dat
x Q2 ct14llo my_ct14llo relative_diff
0 0.000010 25.000000 7.635785e+01 7.630727e+01 0.000662
1 0.000173 25.000000 3.194273e+01 3.192242e+01 0.000636
2 0.003000 25.000000 1.081843e+01 1.081161e+01 0.000631
3 0.051962 25.000000 1.958956e+00 1.958820e+00 0.000069
4 0.900000 25.000000 1.922415e-05 1.955338e-05 -0.017126
5 0.000010 111.803399 1.333957e+02 1.333029e+02 0.000696
6 0.000173 111.803399 4.777286e+01 4.773856e+01 0.000718
7 0.003000 111.803399 1.341028e+01 1.340044e+01 0.000734
8 0.051962 111.803399 1.978216e+00 1.978292e+00 -0.000038
9 0.900000 111.803399 6.644805e-06 6.754169e-06 -0.016459
10 0.000010 500.000000 1.967032e+02 1.965518e+02 0.000770
11 0.000173 500.000000 6.291393e+01 6.286326e+01 0.000805
12 0.003000 500.000000 1.542347e+01 1.541073e+01 0.000826
13 0.051962 500.000000 1.947465e+00 1.947532e+00 -0.000034
14 0.900000 500.000000 2.929060e-06 2.977019e-06 -0.016373
15 0.000010 2236.067977 2.633266e+02 2.631190e+02 0.000789
16 0.000173 2236.067977 7.708540e+01 7.702204e+01 0.000822
17 0.003000 2236.067977 1.700410e+01 1.699004e+01 0.000827
18 0.051962 2236.067977 1.893923e+00 1.894094e+00 -0.000090
19 0.900000 2236.067977 1.544450e-06 1.570506e-06 -0.016870
20 0.000010 10000.000000 3.314097e+02 3.311451e+02 0.000799
21 0.000173 10000.000000 9.023010e+01 9.015575e+01 0.000824
22 0.003000 10000.000000 1.825934e+01 1.824476e+01 0.000798
23 0.051962 10000.000000 1.830992e+00 1.831291e+00 -0.000163
24 0.900000 10000.000000 9.288458e-07 9.442889e-07 -0.016626
my_ct14llo PDF set, member #0, version 1
As you can see, EKO is able to reproduce the numbers from the original LHAPDF grid mostly below the permille level.
The accuracy is mainly limited by the number of points in the x and Q2 grids, which can be made finer to achieve higher precision.
You can also notice that at large x the gluon PDF vanishes, so the worse accuracy of our benchmark in that region is not worrying at all.
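If higher precision is required, you can simply rerun the evolution with denser grids (at the price of a longer runtime). A sketch with purely illustrative grid sizes:
# denser interpolation grids: more x-points and more Q points (illustrative choices)
op_card.xgrid = eko.interpolation.XGrid(make_grid(60, 60))
op_card.mugrid = [(float(q), 5) for q in np.geomspace(5.0, 100, 20)]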