cape.cfdx.databook: Post-processed data module
This module contains functions for reading and processing forces, moments, and other entities from cases in a trajectory. This module forms the core for all database post-processing in Cape, but several other database modules exist for more specific applications.
This module provides three basic classes upon which more specific data classes are developed:
DataBook
: Overall databook container
DataBookComp
: Template databook for an individual component
CaseData
: Template class for one case’s iterative history
The first two of these are subclassed from dict, so that generic data can be accessed with syntax such as DB[coeff] for an appropriately named coefficient. An outline of derived classes for these three templates is shown below.
DataBook
    TriqFMDataBook: post-processed forces & moments
DataBookComp
    FMDataBook: force & moment data, one comp
    TargetDataBook: target data
    TriqFMFaceDataBook: surface CP FM for one comp
    LineLoadDataBook: sectional load databook
    PointSensorGroupDataBook: group of points
    TriqPointGroupDataBook: group of surface points
    PointSensorDataBook: one point sensor
    TriqPointDataBook: one surface point sensor
CaseData
    CaseFM: iterative force & moment history
    CaseResid: iterative residual history
In addition, each solver has its own version of this module:
cape.pycart.dataBook
cape.pyfun.dataBook
cape.pyover.dataBook
The parent class cape.cfdx.databook.DataBook provides a common interface to all of the requested force, moment, point sensor, etc. quantities that have been saved in the data book. Informing Cape which quantities to track, and how to statistically process them, is done using the "DataBook" section of the JSON file, and the various data book options are handled within the API using the cape.cfdx.options.DataBook module.
The master data book class cape.cfdx.databook.DataBook
is based
on the built-in dict
class with keys pointing to force and
moment data books for individual components. For example, if the JSON
file tells Cape to track the forces and/or moments on a component called
"body"
, and the data book is the variable DB, then the force and
moment data book is DB["body"]
. This force and moment data book
contains statistically averaged forces and moments and other statistical
quantities for every case in the run matrix. The class of the force and
moment data book is cape.cfdx.databook.FMDataBook
.
The data book also has the capability to store “target” data books so
that the user can compare results of the current CFD solutions to
previous results or experimental data. These are stored in
DB["Targets"]
and use the cape.cfdx.databook.TargetDataBook
class. Other types of data books can also be created, such as the
cape.cfdx.pointsensor.PointSensorDataBook
class for tracking
statistical properties at individual points in the solution field. Data
books for tracking results of groups of cases are built off of the
cape.cfdx.databook.DataBookComp
class, which contains many common
tools such as plotting.
The cape.cfdx.databook module also contains classes for processing results within individual case folders. This includes the cape.cfdx.databook.CaseFM class for reading iterative force/moment histories and the cape.cfdx.databook.CaseResid class for iterative histories of residuals.
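As a sketch of how such iterative histories are typically consumed, the following computes averaged statistics over a trailing window, the way a databook update uses a force/moment history. The data is synthetic and the names (iters, CA, nStats) are illustrative only, not the actual CaseFM attributes:

```python
import numpy as np

# Synthetic iterative history standing in for a CaseFM force coefficient
iters = np.arange(1, 201)
CA = 0.45 + 0.02 * np.exp(-iters / 40.0)  # converging axial-force coefficient

# Statistics over the last nStats iterations, as a databook update would take
nStats = 100
window = CA[-nStats:]
mu = window.mean()    # averaged value stored in the databook
sigma = window.std()  # iterative standard deviation
print(round(mu, 4), round(sigma, 6))
```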
- class cape.cfdx.databook.DataBook(cntl, RootDir: str | None = None, targ: str | None = None, **kw)¶
Interface to the data book for a given CFD run matrix
- Call:
>>> DB = cape.cfdx.databook.DataBook(cntl, **kw)
- Inputs:
- cntl: Cntl
CAPE control class instance
- RootDir: {None} | str
Root directory
- targ: {None} | str
Name of data book target, if any
- Outputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- DB.x:
cape.runmatrix.RunMatrix
Run matrix of rows saved in the data book
- DB[comp]:
cape.cfdx.databook.FMDataBook
Component data book for component comp
- DB.Components:
list
[str
] List of force/moment components
- DB.Targets:
dict
Dictionary of
TargetDataBook
target data books
- Versions:
2014-12-20 @ddalle: Started
2015-01-10 @ddalle: v1.0
2022-03-07 @ddalle: v1.1; allow .cntl
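Since DataBook is a dict subclass keyed by component name, its access pattern can be pictured with a minimal stand-in. MiniDataBook below is purely illustrative; the real class also reads files, handles locking, and stores FMDataBook instances rather than plain dicts:

```python
# Minimal stand-in for the dict-based databook container (illustration only)
class MiniDataBook(dict):
    """Map component names to per-component data"""

    def __init__(self, comps):
        super().__init__()
        self.Components = list(comps)
        for comp in comps:
            # The real class would read an FMDataBook for each component here
            self[comp] = {"CA": [], "CN": []}


DB = MiniDataBook(["body", "fins"])
DB["body"]["CA"].append(0.451)  # DB[comp][coeff] access pattern
print(DB.Components)
```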
- DeleteTriqPoint(I, comp=None)¶
Delete list of cases from TriqPoint component data books
- Call:
>>> DB.DeleteTriqPoint(I, comp=None)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the data book class
- I: {
None
} |list
[int
] List or array of run matrix indices
- comp: {
None
} |str
|list
Component wild card or list of component wild cards
- Versions:
2017-10-11
@ddalle
: v1.0
- DeleteTriqPointComp(comp, I=None)¶
Delete list of cases from a TriqPoint component data book
- Call:
>>> n = DB.DeleteTriqPointComp(comp, I=None)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the data book class
- comp:
str
Name of component
- I: {
None
} |list
[int
] List or array of run matrix indices
- Outputs:
- n:
int
Number of deletions made
- Versions:
2017-04-25 @ddalle: v1.0
2017-10-11 @ddalle: From DeleteTriqFMComp()
- FindMatch(i: int) int | None ¶
Find an entry by run matrix (trajectory) variables
It is assumed that exact matches can be found.
- Call:
>>> j = DB.FindMatch(i)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- i:
int
Index of the case from the trajectory to try match
- Outputs:
- j: None | int
Index of databook entry that matches case i, if any
- Versions:
2016-02-27
@ddalle
: Added as a pointer to first component
- FindTargetMatch(DBT, i, topts, keylist='tol', **kw)¶
Find a target entry by run matrix (trajectory) variables
Cases will be considered matches by comparing variables specified in the topts variable, which shares some of the options from the "Targets" subsection of the "DataBook" section of cape.json. Suppose that topts contains the following:
{
    "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
    "Tolerances": {"alpha": 0.05, "Mach": 0.01},
    "Keys": ["alpha", "Mach", "beta"]
}
Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled "MACH") and alpha to within 0.05 is considered a match. Because the Keys parameter contains "beta", the search will also look for exact matches in "beta".
If the Keys parameter is not set, the search will use either all the keys in the trajectory, x.cols, or just the keys specified in the "Tolerances" section of topts. Which of these two default lists to use is determined by the keylist input.
- Call:
>>> j = DB.FindTargetMatch(DBT, i, topts, **kw)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- DBT:
DataBookComp
|TargetDataBook
Target component databook
- i:
int
Index of the case from the trajectory to try match
- topts:
dict
|TargetDataBook
Criteria used to determine a match
- keylist:
"x"
| {"tol"
} Source for default list of keys
- source: {
"self"
} |"target"
Match DB case i or DBT case i
- Outputs:
- j:
numpy.ndarray
[int
] Array of indices that match the trajectory
- See also:
- Versions:
2016-02-27 @ddalle: Added as a pointer to first component
2018-02-12 @ddalle: First input x -> DBT
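The matching rule described above can be sketched in a few lines. Everything here (the targ arrays, the case dict) is hypothetical data used to illustrate the semantics of "RunMatrix", "Tolerances", and "Keys"; it is not the actual implementation:

```python
import numpy as np

# Hypothetical target data with its own column names
targ = {
    "MACH": np.array([0.80, 0.80, 0.90]),
    "ALPHA": np.array([2.00, 4.02, 2.00]),
    "beta": np.array([0.0, 0.0, 0.0]),
}
topts = {
    "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
    "Tolerances": {"alpha": 0.05, "Mach": 0.01},
    "Keys": ["alpha", "Mach", "beta"],
}
# Run matrix values for case i
case = {"alpha": 4.0, "Mach": 0.8, "beta": 0.0}

mask = np.ones(3, dtype=bool)
for key in topts["Keys"]:
    col = topts["RunMatrix"].get(key, key)   # translate to target column name
    tol = topts["Tolerances"].get(key, 0.0)  # no tolerance -> exact match
    mask &= np.abs(targ[col] - case[key]) <= tol
j = np.where(mask)[0]
print(j)
```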
- GetDBMatch(j, ftarg, tol=0.0, tols=None)¶
Get index of a target match (if any) for one data book entry
- Call:
>>> i = DB.GetDBMatch(j, ftarg, tol=0.0, tols={})
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of a data book class
- j:
int
|np.nan
Data book target index
- ftarg:
str
Name of the target and column
- tol: float
Tolerance for matching all keys (0.0 enforces equality)
- tols: dict
Dictionary of specific tolerances for each key
- Outputs:
- i:
int
Data book index
- Versions:
2015-08-30
@ddalle
: v1.0
- GetRefComponent()¶
Get first component with type ‘FM’, ‘Force’, or ‘Moment’
- Call:
>>> DBc = DB.GetRefComponent()
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Data book instance
- Outputs:
- DBc:
cape.cfdx.databook.FMDataBook
Data book for one component
- Versions:
2016-08-18
@ddalle
: v1.0
- GetTargetByName(targ)¶
Get a target handle by name of the target
- Call:
>>> DBT = DB.GetTargetByName(targ)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the data book class
- targ:
str
Name of target to find
- Outputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the pyCart data book target class
- Versions:
2015-06-04
@ddalle
: v1.0
- GetTargetMatch(i, ftarg, tol=0.0, tols=None)¶
Get index of a target match for one data book entry
- Call:
>>> j = DB.GetTargetMatch(i, ftarg, tol=0.0, tols={})
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- i:
int
Data book index
- ftarg:
str
Name of the target and column
- tol:
float
Tolerance for matching all keys
- tols:
dict
Dictionary of specific tolerances for each key
- Outputs:
- j:
int
|np.nan
Data book target index
- Versions:
2015-08-30
@ddalle
: v1.0
- GetTargetMatches(ftarg, tol=0.0, tols={})¶
Get vectors of indices matching targets
- Call:
>>> I, J = DB.GetTargetMatches(ftarg, tol=0.0, tols={})
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- ftarg:
str
Name of the target and column
- tol:
float
Tolerance for matching all keys
- tols:
dict
Dictionary of specific tolerances for each key
- Outputs:
- I:
np.ndarray
Array of data book indices with matches
- J:
np.ndarray
Array of target indices for each data book index
- Versions:
2015-08-30
@ddalle
: v1.0
- MatchRunMatrix()¶
Restrict the data book object to points in the trajectory
- Call:
>>> DB.MatchRunMatrix()
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- Versions:
2015-05-28
@ddalle
: v1.0
- PlotCoeff(comp, coeff, I, **kw)¶
Plot a sweep of one coefficient over several cases
- Call:
>>> h = DB.PlotCoeff(comp, coeff, I, **kw)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the data book class
- comp:
str
Component whose coefficient is being plotted
- coeff:
str
Coefficient being plotted
- I:
np.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- x: [ {None} |
str
] RunMatrix key for x axis (else plot against index)
- Label: {comp} |
str
Manually specified label
- Legend: {
True
} |False
Whether or not to use a legend
- StDev: {
None
} |float
Multiple of iterative history standard deviation to plot
- MinMax:
True
| {False
} Option to plot min and max from iterative history
- Uncertainty:
True
| {False
} Whether to plot direct uncertainty
- PlotOptions:
dict
Plot options for the primary line(s)
- StDevOptions:
dict
Plot options for the standard deviation plot
- MinMaxOptions:
dict
Plot options for the min/max plot
- UncertaintyOptions:
dict
Dictionary of plot options for the uncertainty plot
- FigureWidth:
float
Width of figure in inches
- FigureHeight:
float
Height of figure in inches
- PlotTypeStDev: {
"FillBetween"
} |"ErrorBar"
Plot function to use for standard deviation plot
- PlotTypeMinMax: {
"FillBetween"
} |"ErrorBar"
Plot function to use for min/max plot
- PlotTypeUncertainty:
"FillBetween"
| {"ErrorBar"
} Plot function to use for uncertainty plot
- Outputs:
- h:
dict
Dictionary of plot handles
- See also:
- Versions:
2015-05-30 @ddalle: v1.0
2015-12-14 @ddalle: Added error bars
- PlotContour(comp, coeff, I, **kw)¶
Create a contour plot of one coefficient over several cases
- Call:
>>> h = DB.PlotContour(comp, coeff, I, **kw)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the data book class
- comp:
str
Component whose coefficient is being plotted
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- x:
str
RunMatrix key for x axis
- y:
str
RunMatrix key for y axis
- ContourType: {“tricontourf”} | “tricontour” | “tripcolor”
Contour plotting function to use
- LineType: {“plot”} | “triplot” | “none”
Line plotting function to highlight data points
- Label: [ {comp} |
str
] Manually specified label
- ColorBar: [ {
True
} |False
] Whether or not to use a color bar
- ContourOptions:
dict
Plot options to pass to contour plotting function
- PlotOptions:
dict
Plot options for the line plot
- FigureWidth:
float
Width of figure in inches
- FigureHeight:
float
Height of figure in inches
- Outputs:
- h:
dict
Dictionary of plot handles
- See also:
- Versions:
2015-05-30 @ddalle: v1.0
2015-12-14 @ddalle: Added error bars
- ProcessComps(comp=None, **kw)¶
Process list of components
This performs several conversions:
- Call:
>>> DB.ProcessComps(comp=None)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- comp: {
None
} |list
|str
Component or list of components
- Versions:
2017-04-13
@ddalle
: v1.0
- ReadCaseFM(comp)¶
Read a CaseFM object
- Call:
>>> fm = DB.ReadCaseFM(comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- comp:
str
Name of component
- Outputs:
- fm:
cape.cfdx.databook.CaseFM
Iterative force & moment history class
- Versions:
2017-04-13
@ddalle
: First separate version
- ReadCaseProp(comp)¶
Read a CaseProp object
- Call:
>>> prop = DB.ReadCaseProp(comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- comp:
str
Name of component
- Outputs:
- prop:
cape.cfdx.databook.CaseProp
Generic-property iterative history instance
- Versions:
2022-04-08
@ddalle
: v1.0
- ReadCaseResid()¶
Read a CaseResid object
- Call:
>>> H = DB.ReadCaseResid()
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- Outputs:
- H:
cape.cfdx.databook.CaseResid
Residual history class
- Versions:
2017-04-13
@ddalle
: First separate version
- ReadCaseTS(comp)¶
Read a CaseTS (time series) object
- Call:
>>> fm = DB.ReadCaseTS(comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- comp:
str
Name of component
- Outputs:
- fm:
cape.cfdx.databook.CaseFM
Iterative force & moment history class
- Versions:
2017-04-13
@ddalle
: First separate version
- ReadDBCaseProp(comp, check=False, lock=False)¶
Initialize data book for one component
- Call:
>>> DB.ReadDBCaseProp(comp, check=False, lock=False)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- comp:
str
Name of component
- check:
True
| {False
} Whether or not to check for LOCK file
- lock:
True
| {False
} Whether or not to create LOCK file
- Versions:
2015-11-10 @ddalle: v1.0
2017-04-13 @ddalle: Self-contained and renamed
- ReadDBCompTS(comp, check=False, lock=False)¶
Initialize time series data book for one component
- Call:
>>> DB.ReadDBCompTS(comp, check=False, lock=False)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- comp:
str
Name of component
- check:
True
| {False
} Whether or not to check for LOCK file
- lock:
True
| {False
} Whether or not to create LOCK file
- Versions:
2015-11-10 @ddalle: v1.0
2017-04-13 @ddalle: Self-contained and renamed
- ReadFM(comp, check=False, lock=False)¶
Initialize data book for one component
- Call:
>>> DB.ReadFM(comp, check=False, lock=False)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- comp:
str
Name of component
- check:
True
| {False
} Whether or not to check for LOCK file
- lock:
True
| {False
} Whether or not to create LOCK file
- Versions:
2015-11-10 @ddalle: v1.0
2017-04-13 @ddalle: Self-contained and renamed
- ReadLineLoad(comp, conf=None, targ=None, **kw)¶
Read a line load databook
- Call:
>>> DB.ReadLineLoad(comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pycart data book class
- comp:
str
Line load component group
- conf: {
None
} |cape.config.Config
Surface configuration interface
- targ: {
None
} |str
Alternate directory to read from, else DB.targ
- Versions:
2015-09-16 @ddalle: v1.0
2016-06-27 @ddalle: Added targ
- ReadPyFuncDataBook(comp, check=False, lock=False)¶
Initialize data book for one PyFunc component
- Call:
>>> DB.ReadPyFuncDataBook(comp, check=False, lock=False)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- comp:
str
Name of component
- check:
True
| {False
} Whether or not to check for LOCK file
- lock:
True
| {False
} Whether or not to create LOCK file
- Versions:
2022-04-10
@ddalle
: v1.0
- ReadTarget(targ)¶
Read a data book target if it is not already present
- Call:
>>> DB.ReadTarget(targ)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- targ:
str
Target name
- Versions:
2015-09-16
@ddalle
: v1.0
- Sort(key=None, I=None)¶
Sort a data book according to either a key or an index
- Call:
>>> DB.Sort()
>>> DB.Sort(key)
>>> DB.Sort(I=None)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- key:
str
|list
[str
] Name of trajectory key or list of keys on which to sort
- I:
np.ndarray
[int
] List of indices; must have same size as data book
- Versions:
2014-12-30 @ddalle: v1.0
2015-06-19 @ddalle: New multi-key sort
2016-01-13 @ddalle: Checks to allow incomplete comps
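A multi-key sort like this is conventionally built on numpy.lexsort, which takes the primary key last. A small sketch with hypothetical databook columns:

```python
import numpy as np

# Hypothetical databook columns
mach = np.array([0.9, 0.8, 0.9, 0.8])
alpha = np.array([4.0, 2.0, 2.0, 4.0])

# Sort by mach first, then alpha within equal mach values;
# np.lexsort takes the primary key LAST
I = np.lexsort((alpha, mach))
print(mach[I], alpha[I])
```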
- UpdateRunMatrix()¶
Match the trajectory to the cases in the data book
- Call:
>>> DB.UpdateRunMatrix()
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- Versions:
2015-05-22
@ddalle
: v1.0
- UpdateTriqPoint(I, comp=None)¶
Update a TriqPoint triangulation-extracted point sensor data book
- Call:
>>> DB.UpdateTriqPoint(I, comp=None)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- I:
list
[int
] List or array of run matrix indices
- comp: {
None
} |str
Name of TriqPoint group or all if
None
- Versions:
2017-10-11
@ddalle
: v1.0
- UpdateTriqPointComp(comp, I=None)¶
Update a TriqPoint triangulation-extracted data book
- Call:
>>> n = DB.UpdateTriqPointComp(comp, I=None)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- comp: {
None
} |str
Name of TriqPoint group or all if
None
- I: {
None
} |list
[int
] List or array of run matrix indices
- Outputs:
- n:
int
Number of updates made
- Versions:
2017-10-11
@ddalle
: v1.0
- Write(unlock=True)¶
Write the current data book in Python memory to file
- Call:
>>> DB.Write(unlock=True)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- Versions:
2014-12-22 @ddalle: v1.0
2015-06-19 @ddalle: New multi-key sort
2017-06-12 @ddalle: Added unlock
- mkdir(fdir)¶
Create a directory using settings from DataBook>umask
- Call:
>>> DB.mkdir(fdir)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- fdir:
str
Directory to create
- Versions:
2017-09-05
@ddalle
: v1.0
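The umask-derived permissions can be sketched as follows; the lookup of the actual "DataBook" > "umask" option is replaced by a literal default here, so this is an illustration of the idea rather than the real method:

```python
import os
import tempfile

def mkdir_with_umask(fdir, umask=0o027):
    """Create fdir with permissions of 0o777 minus the masked bits"""
    # e.g. 0o777 & ~0o027 -> 0o750; the process umask may mask further,
    # so real code may follow up with os.chmod()
    os.mkdir(fdir, 0o777 & ~umask)

root = tempfile.mkdtemp()
fdir = os.path.join(root, "lineload")
mkdir_with_umask(fdir)
print(os.path.isdir(fdir))
```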
- class cape.cfdx.databook.DataBookComp(comp, cntl, check=False, lock=False, **kw)¶
Individual item data book basis class
- Call:
>>> DBi = DataBookComp(comp, cntl, check=False, lock=False)
- Inputs:
- comp:
str
Name of the component or other item name
- cntl:
Cntl
CAPE control class instance
- check:
True
| {False
} Whether or not to check LOCK status
- lock:
True
| {False
} If
True
, wait if the LOCK file exists
- Outputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- Versions:
2014-12-22 @ddalle: v1.0
2015-12-04 @ddalle: v1.0 (fork DBComp)
2025-01-22 @aburkhea: v2.0
2025-05-25 @ddalle: v2.1; rename DBBase -> DataBookComp
- ArgSort(key=None)¶
Return indices that would sort a data book by a trajectory key
- Call:
>>> I = DBi.ArgSort(key=None)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- key:
str
Name of trajectory key to use for sorting; default is first key
- Outputs:
- I:
numpy.ndarray
[int
] List of indices; must have same size as data book
- Versions:
2014-12-30
@ddalle
: v1.0
- CheckLock()¶
Check if lock file for this component exists
- Call:
>>> q = DBc.CheckLock()
- Inputs:
- DBc:
cape.cfdx.databook.DataBookBase
Data book base object
- Outputs:
- q:
bool
Whether or not corresponding LOCK file exists
- Versions:
2017-06-12
@ddalle
: v1.0
- DeleteCases(I, comp)¶
Delete list of cases from data book
- Call:
>>> n = DB.DeleteCases(I, comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- I:
list
[int
] List of trajectory indices
- Outputs:
- n:
int
Number of deleted entries
- Versions:
2015-03-13 @ddalle: v1.0
2017-04-13 @ddalle: Split by component
- FindCaseIndex(j: int) int | None ¶
Find index of databook entry j in run matrix (if any)
- Call:
>>> i = db.FindCaseIndex(j)
- Inputs:
- db:
DataBookComp
Single databook component
- j:
int
Databook index
- Outputs:
- i:
None
|int
Run matrix index, if applicable
- Versions:
2024-10-16
@ddalle
: v1.0
- FindCoSweep(x, i, EqCons=[], TolCons={}, GlobCons=[], xkeys={})¶
Find data book entries meeting constraints seeded from point i
Cases will be considered matches if data book values match trajectory x point i. For example, suppose that EqCons and TolCons have the following values:
EqCons = ["beta"]
TolCons = {"alpha": 0.05, "mach": 0.01}
Then this method will compare DBc["mach"] to x.mach[i]. Any case that passes all of the following tests will be included:
abs(DBc["mach"] - x.mach[i]) <= 0.01
abs(DBc["alpha"] - x.alpha[i]) <= 0.05
DBc["beta"] == x.beta[i]
All entries must also meet a list of global constraints from GlobCons. Users can also use xkeys as a dictionary of alternate key names to compare to the trajectory. Consider the following values:
TolCons = {"alpha": 0.05}
xkeys = {"alpha": "AOA"}
Then the test becomes:
abs(DBc["AOA"] - x.alpha[i]) <= 0.05
- Call:
>>> J = DBc.FindCoSweep(x, i, EqCons={}, TolCons={}, **kw)
- Inputs:
- DBc:
cape.cfdx.databook.DataBookComp
Data book component instance
- x:
cape.runmatrix.RunMatrix
RunMatrix (i.e. run matrix) to use for target value
- i:
int
Index of the case from the trajectory to try match
- EqCons: {
[]
} |list
(str
) List of variables that must match the trajectory exactly
- TolCons: {
{}
} |dict
[float
] List of variables that may match trajectory within a tolerance
- GlobCons: {
[]
} |list
(str
) List of global constraints, see
cape.RunMatrix.Filter()
- xkeys: {
{}
} |dict
(str
) Dictionary of alternative names of variables
- Outputs:
- J:
numpy.ndarray
[int
] Array of indices that match the trajectory within tolerances
- See also:
- Versions:
2014-12-21 @ddalle: v1.0
2016-06-27 @ddalle: Moved from TargetDataBook and generalized
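The EqCons/TolCons/xkeys logic above reduces to a vectorized mask. The sketch below uses hypothetical arrays in place of a real databook component and run matrix; it illustrates the semantics, not the actual implementation:

```python
import numpy as np

# Hypothetical databook component; alpha is stored under an alternate name
DBc = {
    "mach": np.array([0.800, 0.805, 0.800]),
    "AOA": np.array([2.00, 2.00, 2.30]),
    "beta": np.array([0.0, 0.0, 0.0]),
}
# Trajectory values for seed case i
xi = {"mach": 0.80, "alpha": 2.02, "beta": 0.0}

EqCons = ["beta"]
TolCons = {"alpha": 0.05, "mach": 0.01}
xkeys = {"alpha": "AOA"}  # alternate databook column names

mask = np.ones(3, dtype=bool)
for k in EqCons:
    mask &= DBc[xkeys.get(k, k)] == xi[k]      # exact-match constraints
for k, tol in TolCons.items():
    mask &= np.abs(DBc[xkeys.get(k, k)] - xi[k]) <= tol  # tolerance constraints
J = np.where(mask)[0]
print(J)
```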
- FindDBMatch(DBc, i: int)¶
Find the index of an exact match to case i in another databook
- Call:
>>> j = DBi.FindDBMatch(DBc, i)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
Data book base object
- DBc:
cape.cfdx.databook.DataBookComp
Another data book base object
- i:
int
Data book index for DBi
- Outputs:
- j:
None
|int
Data book index for DBc
- Versions:
2017-06-26
@ddalle
: v1.0
- FindMatch(i: int) int | None ¶
Find an entry by run matrix (trajectory) variables
It is assumed that exact matches can be found. However, run matrix keys that do not affect the name of the case folder are ignored.
- Call:
>>> j = db.FindMatch(i)
- Inputs:
- db:
DataBookComp
An individual-component data book
- i:
int
Run matrix index to match
- Outputs:
- j:
None
|int
Index of databook entry that matches run matrix case i
- Versions:
2014-12-22 @ddalle: v1.0
2024-10-16 @ddalle: v1.1; replace np.nan -> None
- FindTargetMatch(DBT, i, topts={}, keylist='tol', **kw)¶
Find a target entry by run matrix (trajectory) variables
Cases will be considered matches by comparing variables specified in the topts variable, which shares some of the options from the "Targets" subsection of the "DataBook" section of cape.json. Suppose that topts contains the following:
{
    "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
    "Tolerances": {"alpha": 0.05, "Mach": 0.01},
    "Keys": ["alpha", "Mach", "beta"]
}
Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled "MACH") and alpha to within 0.05 is considered a match. Because the Keys parameter contains "beta", the search will also look for exact matches in "beta".
If the Keys parameter is not set, the search will use either all the keys in the trajectory, x.cols, or just the keys specified in the "Tolerances" section of topts. Which of these two default lists to use is determined by the keylist input.
- Call:
>>> j = DBc.FindTargetMatch(DBT, i, topts, **kw)
- Inputs:
- DBc:
cape.cfdx.databook.DataBookComp
Instance of original databook
- DBT:
DataBookComp
|TargetDataBook
Target databook of any type
- i:
int
Case index from DBc.x for DBT.x to match
- topts:
dict
|TargetDataBook
Criteria used to determine a match
- keylist: {
"x"
} |"tol"
Test key source:
x.cols
|topts.Tolerances
- source:
"self"
| {"target"
} Match DBc.x case i if
"self"
, else DBT.x i
- Outputs:
- j:
numpy.ndarray
[int
] Array of indices that match within tolerances
- See also:
- Versions:
2014-12-21 @ddalle: v1.0
2016-06-27 @ddalle: v1.1; Moved from TargetDataBook
2018-02-12 @ddalle: v1.2; First arg DataBookComp
- GetDeltaStats(DBT, comp, coeff, I, topts={}, **kw)¶
Calculate statistics on differences between two databooks
- Call:
>>> S = DBc.GetDeltaStats(DBT, coeff, I, topts=None, **kw)
- Inputs:
- DBc:
cape.cfdx.databook.DataBookComp
Component databook
- coeff:
str
Name of coefficient on which to compute statistics
- I:
list
[int
] Indices of cases/entries to consider
- topts: {
{}
} |dict
Dictionary of tolerances for variables in question
- keylist: {
"x"
} |"tol"
Default test key source:
x.cols
ortopts.Tolerances
- CombineTarget: {
True
} |False
For cases with multiple matches, compare to mean target value
- Outputs:
- S:
dict
Dictionary of statistical results
- S[“delta”]:
np.ndarray
Array of deltas for each valid case
- S[“n”]:
int
Number of valid cases
- S[“mu”]:
float
Mean of histogram
- Versions:
2018-02-12
@ddalle
: v1.0
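The delta statistics amount to differencing matched values and summarizing. A sketch with hypothetical matched-case arrays (not the real method, which also handles target matching and tolerances):

```python
import numpy as np

# Hypothetical databook and (mean) target values for matched cases
db_vals = np.array([0.451, 0.448, 0.455])
targ_vals = np.array([0.450, 0.450, 0.450])

delta = db_vals - targ_vals
S = {
    "delta": delta,             # array of deltas for each valid case
    "n": int(delta.size),       # number of valid cases
    "mu": float(delta.mean()),  # mean of the histogram
}
print(S["n"], round(S["mu"], 4))
```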
- GetLockFile()¶
Get the name of the potential lock file
- Call:
>>> flock = DBc.GetLockFile()
- Inputs:
- DBc:
cape.cfdx.databook.DataBookBase
Data book base object
- Outputs:
- flock:
str
Full path to potential
lock
file
- Versions:
2017-06-12
@ddalle
: v1.0
- GetRunMatrixIndex(j)¶
Find an entry in the run matrix (trajectory)
- Call:
>>> i = DBi.GetRunMatrixIndex(j)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- j:
int
Index of the case from the databook to try match
- Outputs:
- i:
int
RunMatrix index or
None
- Versions:
2015-05-28
@ddalle
: v1.0
- Lock()¶
Write a ‘LOCK’ file for a data book component
- Call:
>>> DBc.Lock()
- Inputs:
- DBc:
cape.cfdx.databook.DataBookBase
Data book base object
- Versions:
2017-06-12
@ddalle
: v1.0
- Merge(DBc)¶
Merge another copy of the data book object
- Call:
>>> DBi.Merge(DBc)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
Component data book
- DBc:
cape.cfdx.databook.DataBookComp
Copy of component data book, perhaps read at a different time
- Versions:
2017-06-26
@ddalle
: v1.0
- PlotCoeff(coeff, I, **kw)¶
Plot a sweep of one coefficient over several cases
- Call:
>>> h = DBi.PlotCoeff(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2015-05-30 @ddalle: v1.0
2015-12-14 @ddalle: Added error bars
- PlotCoeffBase(coeff, I, **kw)¶
Plot sweep of one coefficient or quantity over several cases
This is the base method upon which data book sweep plotting is built. Other methods may call this one with modifications to the default settings. For example, cape.cfdx.databook.TargetDataBook.PlotCoeff() changes the default PlotOptions to show a red line instead of the standard black line. All settings can still be overruled by explicit inputs to either this function or any of its children.
- Call:
>>> h = DBi.PlotCoeffBase(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- x: {
None
} |str
RunMatrix key for x axis (or plot against index if
None
)- Label: {comp} |
str
Manually specified label
- Legend: {
True
} |False
Whether or not to use a legend
- StDev: {
None
} |float
Multiple of iterative history standard deviation to plot
- MinMax: {
False
} |True
Whether to plot minimum and maximum over iterative history
- Uncertainty: {
False
} |True
Whether to plot direct uncertainty
- PlotOptions:
dict
Plot options for the primary line(s)
- StDevOptions:
dict
Dictionary of plot options for the standard deviation plot
- MinMaxOptions:
dict
Dictionary of plot options for the min/max plot
- UncertaintyOptions:
dict
Dictionary of plot options for the uncertainty plot
- FigureWidth:
float
Width of figure in inches
- FigureHeight:
float
Height of figure in inches
- PlotTypeStDev: {
'FillBetween'
} |'ErrorBar'
Plot function to use for standard deviation plot
- PlotTypeMinMax: {
'FillBetween'
} |'ErrorBar'
Plot function to use for min/max plot
- PlotTypeUncertainty:
'FillBetween'
| {'ErrorBar'
} Plot function to use for uncertainty plot
- LegendFontSize: {
9
} |int
> 0 |float
Font size for use in legends
- Grid: {
None
} |True
|False
Turn on/off major grid lines, or leave as is if
None
- GridStyle: {
{}
} |dict
Dictionary of major grid line line style options
- MinorGrid: {
None
} |True
|False
Turn on/off minor grid lines, or leave as is if
None
- MinorGridStyle: {
{}
} |dict
Dictionary of minor grid line line style options
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2015-05-30 @ddalle: v1.0
2015-12-14 @ddalle: Added error bars
- PlotContour(coeff, I, **kw)¶
Create a contour plot for a subset of cases
- Call:
>>> h = DBi.PlotContour(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2017-04-17
@ddalle
: v1.0
- PlotContourBase(coeff, I, **kw)¶
Create a contour plot of selected data points
- Call:
>>> h = DBi.PlotContourBase(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- x:
str
RunMatrix key for x axis
- y:
str
RunMatrix key for y axis
- ContourType: {“tricontourf”} | “tricontour” | “tripcolor”
Contour plotting function to use
- LineType: {“plot”} | “triplot” | “none”
Line plotting function to highlight data points
- Label: [ {comp} |
str
] Manually specified label
- ColorMap: {
"jet"
} |str
Name of color map to use
- ColorBar: [ {
True
} |False
] Whether or not to use a color bar
- ContourOptions:
dict
Plot options to pass to contour plotting function
- PlotOptions:
dict
Plot options for the line plot
- FigureWidth:
float
Width of figure in inches
- FigureHeight:
float
Height of figure in inches
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2017-04-17
@ddalle
: v1.0
- PlotHist(coeff, I, **kw)¶
Plot a histogram over several cases
- Call:
>>> h = DBi.PlotHist(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2016-04-04
@ddalle
: v1.0
- PlotHistBase(coeff, I, **kw)¶
Plot a histogram of one coefficient over several cases
- Call:
>>> h = DBi.PlotHistBase(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- FigureWidth:
float
Figure width
- FigureHeight:
float
Figure height
- Label: [ {comp} |
str
] Manually specified label
- Target: {None} | DataBookComp | list
Target database or list thereof
- TargetValue:
float
|list
[float
] Target or list of target values
- TargetLabel:
str
|list
(str
) Legend label(s) for target(s)
- StDev: [ {None} |
float
] Multiple of iterative history standard deviation to plot
- HistOptions:
dict
Plot options for the primary histogram
- StDevOptions:
dict
Dictionary of plot options for the standard deviation plot
- DeltaOptions:
dict
Options passed to plt.plot() for reference range plot
- MeanOptions:
dict
Options passed to plt.plot() for mean line
- TargetOptions:
dict
Options passed to plt.plot() for target value lines
- OutlierSigma: {7.0} | float
Standard deviation multiplier for determining outliers
- ShowMu:
bool
Option to print value of mean
- ShowSigma:
bool
Option to print value of standard deviation
- ShowError:
bool
Option to print value of sampling error
- ShowDelta:
bool
Option to print reference value
- ShowTarget:
bool
Option to show target value
- MuFormat: {"%.4f"} | str
Format for text label of the mean value
- DeltaFormat: {"%.4f"} | str
Format for text label of the reference value d
- SigmaFormat: {"%.4f"} | str
Format for text label of the iterative standard deviation
- TargetFormat: {"%.4f"} | str
Format for text label of the target value
- XLabel:
str
Specified label for x-axis, default is
Iteration Number
- YLabel:
str
Specified label for y-axis, default is c
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2015-05-30
@ddalle
: v1.0
2015-12-14
@ddalle
: Added error bars
2016-04-04
@ddalle
: Moved from point sensor to data book
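The statistics keywords above (ShowMu, ShowSigma, ShowError, OutlierSigma) amount to simple operations on the sweep values. A minimal sketch of that filtering with a hypothetical helper name; this is not Cape's actual implementation:

```python
import numpy as np

def hist_stats(v, osig=7.0):
    """Compute histogram statistics with outlier rejection.

    Values more than *osig* standard deviations from the mean are
    dropped, mirroring the role of the OutlierSigma keyword.
    """
    v = np.asarray(v, dtype=float)
    mu = np.mean(v)
    sig = np.std(v)
    # Keep only values within osig*sigma of the raw mean
    vf = v[np.abs(v - mu) <= osig * sig]
    # Recompute statistics on the filtered sample
    mu_f = np.mean(vf)
    sig_f = np.std(vf)
    # Sampling error estimate: sigma / sqrt(n)
    eps = sig_f / np.sqrt(vf.size)
    return mu_f, sig_f, eps
```
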
- PlotRangeHist(coeff, I, **kw)¶
Plot a range histogram over several cases
- Call:
>>> h = DBi.PlotRangeHist(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2016-04-04
@ddalle
: v1.0
- PlotRangeHistBase(coeff, I, **kw)¶
Plot a range histogram of one coefficient over several cases
- Call:
>>> h = DBi.PlotRangeHistBase(coeff, I, **kw)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- FigureWidth:
float
Figure width
- FigureHeight:
float
Figure height
- Label: {comp} |
str
Manually specified label
- Target: DataBookComp | list
Target database or list thereof
- TargetValue:
float
|list
[float
] Target or list of target values
- TargetLabel:
str
|list
(str
) Legend label(s) for target(s)
- StDev: {3.6863} | None | float
Multiple of iterative history standard deviation to plot
- HistOptions:
dict
Plot options for the primary histogram
- StDevOptions:
dict
Dictionary of plot options for the standard deviation plot
- DeltaOptions:
dict
Options passed to plt.plot() for reference range plot
- TargetOptions:
dict
Options passed to plt.plot() for target value lines
- OutlierSigma: {3.6863} | float
Standard deviation multiplier for determining outliers
- ShowMu:
bool
Option to print value of mean
- ShowSigma:
bool
Option to print value of standard deviation
- ShowDelta:
bool
Option to print reference value
- ShowTarget:
bool
Option to show target value
- MuFormat: {"%.4f"} | str
Format for text label of the mean value
- DeltaFormat: {"%.4f"} | str
Format for text label of the reference value d
- SigmaFormat: {"%.4f"} | str
Format for text label of the iterative standard deviation
- TargetFormat: {"%.4f"} | str
Format for text label of the target value
- XLabel:
str
Specified label for x-axis, default is
Iteration Number
- YLabel:
str
Specified label for y-axis, default is c
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2015-05-30
@ddalle
: v1.0
2015-12-14
@ddalle
: Added error bars
2016-04-04
@ddalle
: Moved from point sensor to data book
- ProcessColumns()¶
Process column names
- Call:
>>> DBi.ProcessColumns()
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
Data book base object
- Effects:
- DBi.xCols:
list
(str
) List of trajectory keys
- DBi.fCols:
list
(str
) List of floating point data columns
- DBi.iCols:
list
(str
) List of integer data columns
- DBi.cols:
list
(str
) Total list of columns
- DBi.nxCol:
int
Number of trajectory keys
- DBi.nfCol:
int
Number of floating point keys
- DBi.niCol:
int
Number of integer data columns
- DBi.nCol:
int
Total number of columns
- Versions:
2016-03-15
@ddalle
: v1.0
- ProcessConverters()¶
Process the list of converters to read and write each column
- Read(fname, check=False, lock=False)¶
Read a data book statistics file
- Call:
>>> DBc.Read()
>>> DBc.Read(fname, check=False, lock=False)
- Inputs:
- DBc:
cape.cfdx.databook.DataBookComp
Data book base object
- fname:
str
Name of data file to read
- check: True | {False}
Whether or not to check LOCK status
- lock: True | {False}
If True, wait if the LOCK file exists
- Versions:
2015-12-04
@ddalle
: v1.0
2017-06-12
@ddalle
: Added lock
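The check and lock keywords implement a simple LOCK-file handshake around the data file. A self-contained sketch of that protocol; the "lock." filename prefix and the polling interval here are assumptions, not Cape's exact convention:

```python
import os
import time

def read_with_lock(fname, check=False, lock=False, timeout=60.0):
    """Read *fname*, optionally honoring a LOCK file beside it.

    If *check* or *lock* is True, wait for any existing LOCK file
    to clear (up to *timeout* seconds); if *lock* is True, claim
    the resource by touching our own LOCK file before reading.
    """
    flock = os.path.join(
        os.path.dirname(os.path.abspath(fname)),
        "lock." + os.path.basename(fname))
    if check or lock:
        # Poll until the LOCK file disappears or we time out
        t0 = time.time()
        while os.path.isfile(flock) and time.time() - t0 < timeout:
            time.sleep(0.5)
    if lock:
        # Touch our own LOCK file to warn other processes
        open(flock, "a").close()
    with open(fname) as fp:
        return fp.read()
```
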
- abstract ReadCase()¶
Read data book
- ReadCopy(check=False, lock=False)¶
Read a copied database object
- Call:
>>> DBc1 = DBc.ReadCopy(check=False, lock=False)
- Inputs:
- DBc:
cape.cfdx.databook.DataBookComp
Data book base object
- check: True | {False}
Whether or not to check LOCK status
- lock: True | {False}
If True, wait if the LOCK file exists
- Outputs:
- DBc1:
cape.cfdx.databook.DataBookComp
Copy of data book base object
- Versions:
2017-06-26
@ddalle
: v1.0
- Sort(key=None, I=None)¶
Sort a data book according to either a key or an index
- Call:
>>> DBi.Sort()
>>> DBi.Sort(key)
>>> DBi.Sort(I=None)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- key:
str
Name of trajectory key to use for sorting; default is first key
- I:
numpy.ndarray
[int
] List of indices; must have same size as data book
- Versions:
2014-12-30
@ddalle
: v1.0
2017-04-18
@ddalle
: Using np.lexsort()
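Sorting by multiple trajectory keys relies on np.lexsort(), which takes the primary sort key last. A small self-contained example of the same pattern applied to a dict of column arrays (illustrative, not Cape's internal code):

```python
import numpy as np

# A tiny databook-like dict of columns: sort by Mach number first,
# then by alpha within each Mach group.
db = {
    "mach":  np.array([0.8, 0.5, 0.8, 0.5]),
    "alpha": np.array([4.0, 2.0, 0.0, 0.0]),
}
# np.lexsort sorts by the LAST key first, so "mach" is primary here
I = np.lexsort((db["alpha"], db["mach"]))
# Apply the same permutation to every column
for col in db:
    db[col] = db[col][I]
```
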
- TouchLock()¶
Touch a ‘LOCK’ file for a data book component to reset its mod time
- Call:
>>> DBc.TouchLock()
- Inputs:
- DBc:
cape.cfdx.databook.DataBookBase
Data book base object
- Versions:
2017-06-14
@ddalle
: v1.0
- Unlock()¶
Delete the LOCK file if it exists
- Call:
>>> DBc.Unlock()
- Inputs:
- DBc:
cape.cfdx.databook.DataBookBase
Data book base object
- Versions:
2017-06-12
@ddalle
: v1.0
- UpdateRunMatrix()¶
Match the trajectory to the cases in the data book
- Call:
>>> DBi.UpdateRunMatrix()
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
Component data book
- Versions:
2017-04-18
@ddalle
: v1.0
- Write(fname=None, merge=False, unlock=True)¶
Write a single data book summary file
- Call:
>>> DBi.Write()
>>> DBi.Write(fname, merge=False, unlock=True)
- Inputs:
- DBi:
cape.cfdx.databook.DataBookComp
An individual item data book
- fname:
str
Name of data file to write
- merge: True | {False}
Whether or not to attempt a merger before writing
- unlock: {True} | False
Whether or not to delete any lock files
- Versions:
2015-12-04
@ddalle
: v1.0
2017-06-12
@ddalle
: Added unlock
2017-06-26
@ddalle
: Added merge
- mkdir(fdir)¶
Create a directory using settings from DataBook>umask
- Call:
>>> DB.mkdir(fdir)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the Cape data book class
- fdir:
str
Directory to create
- Versions:
2017-09-05
@ddalle
: v1.0
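The DataBook>umask setting maps to directory permissions in the usual POSIX way: mode 0o777 with the umask bits cleared. A sketch with a hypothetical helper name, not Cape's actual method:

```python
import os

def mkdir_with_umask(fdir, umask=0o027):
    """Create *fdir* with permissions derived from a databook umask.

    The directory mode is 0o777 with the umask bits cleared, so
    umask=0o027 yields mode 0o750 (rwxr-x---).
    """
    mode = 0o777 & ~umask
    os.makedirs(fdir, mode=mode, exist_ok=True)
    # makedirs is also subject to the process umask, so set the
    # final mode on the leaf directory explicitly
    os.chmod(fdir, mode)
```
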
- class cape.cfdx.databook.FMDataBook(comp, cntl, targ=None, check=False, lock=False, **kw)¶
Individual force & moment component data book
This class is derived from
cape.cfdx.databook.DataBookComp.
- Call:
>>> DBi = FMDataBook(comp, cntl, targ=None, **kw)
- Inputs:
- Outputs:
- DBi:
cape.cfdx.databook.FMDataBook
An individual component data book
- Versions:
2014-12-20
@ddalle
: Started
2014-12-22
@ddalle
: v1.0
2016-06-27
@ddalle
: Added target option for using other folders
- GetCoeff(comp, coeff, I, **kw)¶
Get a coefficient value for one or more cases
- Call:
>>> v = DBT.GetCoeff(comp, coeff, i)
>>> V = DBT.GetCoeff(comp, coeff, I)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target class
- comp:
str
Component whose coefficient is being plotted
- coeff:
str
Coefficient being plotted
- i:
int
Individual case/entry index
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Outputs:
- v:
float
Scalar value from the appropriate column
- V:
numpy.ndarray
Array of values from the appropriate column
- Versions:
2018-02-12
@ddalle
: v1.0
- ReadCase(comp)¶
Read a
CaseFM
object
- Call:
>>> fm = DB.ReadCase(comp)
- Inputs:
- DB:
cape.cfdx.databook.FMDataBook
Instance of data book class
- comp:
str
Name of component
- Outputs:
- fm:
cape.cfdx.databook.CaseFM
Iterative force & moment history class
- Versions:
2017-04-13
@ddalle
: First separate version
- ReadCaseResid()¶
Read a
CaseResid
object
- Call:
>>> H = DB.ReadCaseResid()
- Inputs:
- DB:
cape.cfdx.databook.DataBookComp
Instance of data book class
- Outputs:
- H:
cape.cfdx.databook.CaseResid
Residual history class
- Versions:
2017-04-13
@ddalle
: First separate version
- TransformFM(topts, mask=None)¶
Transform force and moment coefficients
Available transformations and their parameters are
“Euler123”: “phi”, “theta”, “psi”
“Euler321”: “psi”, “theta”, “phi”
“ScaleCoeffs”: “CA”, “CY”, “CN”, “CLL”, “CLM”, “CLN”
Other variables (columns) in the databook are used to specify values to use for the transformation variables. For example,
topts = {"Type": "Euler321", "psi": "Psi", "theta": "Theta", "phi": "Phi"}
will cause this function to perform a reverse Euler 3-2-1 transformation using dbc[“Psi”], dbc[“Theta”], and dbc[“Phi”] as the angles.
Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.
topts = {"Type": "ScaleCoeffs", "CLL": -1.0, "CLN": -1.0}
- Call:
>>> dbc.TransformFM(topts, mask=None)
- Inputs:
- dbc:
DataBookComp
Instance of the force and moment class
- topts:
dict
Dictionary of options for the transformation
- mask: {None} | np.ndarray [int]
Optional subset of cases to transform
- Versions:
2021-11-18
@ddalle
: v1.0
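For reference, a self-contained numpy sketch of an Euler 3-2-1 (yaw-pitch-roll) rotation applied to a force-coefficient vector. The axis and sign conventions here are assumptions for illustration and may differ from Cape's exact transformation:

```python
import numpy as np

def euler321_fm(ca, cy, cn, psi, theta, phi):
    """Rotate the body-axis force vector [ca, cy, cn] through a
    3-2-1 Euler sequence (angles in radians)."""
    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cph, sph = np.cos(phi), np.sin(phi)
    # Individual axis rotations: yaw (3), pitch (2), roll (1)
    R3 = np.array([[cps, -sps, 0], [sps, cps, 0], [0, 0, 1]])
    R2 = np.array([[cth, 0, sth], [0, 1, 0], [-sth, 0, cth]])
    R1 = np.array([[1, 0, 0], [0, cph, -sph], [0, sph, cph]])
    # Compose in 3-2-1 order and apply to the force coefficients
    return (R3 @ R2 @ R1) @ np.array([ca, cy, cn])
```
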
- UpdateCaseDB(i, j, comp)¶
Update or add a case to a data book
The history of a run directory is processed if any one of three criteria is met.
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- Call:
>>> n = DB.UpdateCaseDB(i, j, comp)
- Inputs:
- Outputs:
- n: 0 | 1
How many updates were made
- Versions:
2014-12-22
@ddalle
: v1.0
2017-04-12
@ddalle
: Modified to work one component
2017-04-23
@ddalle
: Added output
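The three update criteria can be expressed as a short predicate. This is an illustrative helper under assumed argument names, not part of the Cape API:

```python
def needs_update(in_databook, n_iter_case, n_iter_db,
                 n_stats_case, n_stats_db):
    """Return True if a case should be (re)processed, per the three
    criteria listed above."""
    if not in_databook:
        return True   # case not already in the data book
    if n_iter_case > n_iter_db:
        return True   # new iterations since the last update
    if n_stats_case != n_stats_db:
        return True   # statistics window changed
    return False
```
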
- class cape.cfdx.databook.PropDataBook(comp, cntl, targ=None, check=False, **kw)¶
Individual generic-property component data book
This class is derived from
cape.cfdx.databook.DataBookComp.
- Call:
>>> dbk = PropDataBook(comp, cntl, targ=None, **kw)
- Inputs:
- Outputs:
- dbk:
PropDataBook
An individual generic-property component data book
- Versions:
2014-12-20
@ddalle
: Started
2014-12-22
@ddalle
: v1.0 (DBComp)
2016-06-27
@ddalle
: v1.1
2022-04-08
@ddalle
: v1.0
- ReadCase(comp)¶
Read a
CaseProp
object
- Call:
>>> prop = DB.ReadCase(comp)
- Inputs:
- DB:
cape.cfdx.databook.PropDataBook
Instance of data book class
- comp:
str
Name of component
- Outputs:
- prop:
cape.cfdx.databook.CaseProp
Generic-property iterative history instance
- Versions:
2022-04-08
@ddalle
: v1.0
- UpdateCaseDB(i, j, comp)¶
Update or add a case to a data book
The history of a run directory is processed if any one of three criteria is met.
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- Call:
>>> n = DB.UpdateCaseDB(i, j, comp)
- Inputs:
- Outputs:
- n: 0 | 1
How many updates were made
- Versions:
2014-12-22
@ddalle
: v1.0
2017-04-12
@ddalle
: Modified to work one component
2017-04-23
@ddalle
: Added output
- class cape.cfdx.databook.PyFuncDataBook(comp, cntl, targ=None, check=False, **kw)¶
Individual scalar Python output component data book
This class is derived from
DataBookComp.
- Call:
>>> dbk = PyFuncDataBook(comp, x, opts, funcname, **kw)
- Inputs:
- comp:
str
Name of the component
- x:
cape.runmatrix.RunMatrix
RunMatrix for processing variable types
- opts:
cape.cfdx.options.Options
Global pyCart options instance
- funcname:
str
Name of function to execute
- targ: {None} | str
If used, read a duplicate data book as a target named targ
- check: True | {False}
Whether or not to check LOCK status
- lock: True | {False}
If True, wait if the LOCK file exists
- Outputs:
- dbk:
PyFuncDataBook
An individual Python-function component data book
- Versions:
2014-12-20
@ddalle
: Started
2014-12-22
@ddalle
: v1.0 (DBComp)
2016-06-27
@ddalle
: v1.1
2022-04-10
@ddalle
: v1.0
- ExecPyFuncDataBook(i)¶
Execute main PyFunc function and return results
- Call:
>>> v = db.ExecPyFuncDataBook(i)
- Inputs:
- db:
PyFuncDataBook
Databook component of type
"PyFunc"
- i:
int
Run matrix case index
- Outputs:
- v:
tuple
Outputs from db.funcname in folder of case i
- Versions:
2022-04-13
@ddalle
: v1.0
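Conceptually, executing a "PyFunc" component means resolving the dotted funcname and calling it for the case. A sketch of the name-resolution step with a hypothetical helper; Cape's actual handling of the case folder and arguments is omitted:

```python
import importlib

def exec_pyfunc(funcname, *args, **kw):
    """Resolve a dotted function name such as ``"mymod.myfunc"``
    and call it, returning the result as a tuple of outputs."""
    modname, _, attr = funcname.rpartition(".")
    mod = importlib.import_module(modname)
    func = getattr(mod, attr)
    v = func(*args, **kw)
    # Databook columns expect a tuple of scalar outputs
    return v if isinstance(v, tuple) else (v,)
```
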
- UpdateCaseDB(i, j, comp)¶
Update or add a case to a data book
The history of a run directory is processed if any one of three criteria is met.
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- Call:
>>> n = DB.UpdateCaseDB(i, j, comp)
- Inputs:
- Outputs:
- n: 0 | 1
How many updates were made
- Versions:
2014-12-22
@ddalle
: v1.0
2017-04-12
@ddalle
: Modified to work one component
2017-04-23
@ddalle
: Added output
- class cape.cfdx.databook.TargetDataBook(targ, x, opts, RootDir=None)¶
Class to handle data from data book target files
Target files are subject to more constraints than the files the databook itself creates.
- Call:
>>> DBT = TargetDataBook(targ, x, opts, RootDir=None)
- Inputs:
- targ:
cape.cfdx.options.DataBook.TargetDataBook
Instance of a target source options interface
- x:
pyCart.runmatrix.RunMatrix
Run matrix interface
- opts:
cape.cfdx.options.Options
Options interface
- RootDir:
str
Root directory, defaults to
os.getcwd()
- Outputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target class
- Versions:
2015-01-10
@ddalle
: v1.0
2015-12-14
@ddalle
: v1.1; add uncertainties
- CheckColumn(ctargs, pt, cf, sfx)¶
Check a data book target column name and its consistency
- Call:
>>> fi = DBT.CheckColumn(ctargs, pt, c)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the data book target class
- ctargs:
dict
Dictionary of target column names for each coefficient
- pt:
str
Name of subcomponent (short for ‘point’)
- c:
str
Name of the coefficient in question, including suffix
- Outputs:
- fi:
None
|str
Name of the column in data book if present
- Versions:
2015-12-14
@ddalle
: v1.0
- FindMatch(DBc, i)¶
Find an entry by run matrix (trajectory) variables
Cases will be considered matches by comparing variables specified in the DataBook section of
cape.json
as cases to compare against. Suppose that the control file contains the following.
"DataBook": {
    "Targets": {
        "Experiment": {
            "File": "WT.dat",
            "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
            "Tolerances": {
                "alpha": 0.05,
                "Mach": 0.01
            }
        }
    }
}
Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled MACH) and alpha to within 0.05 is considered a match. If there are more trajectory variables, they are not used for this filtering of matches.
- Call:
>>> j = DBT.FindMatch(x, i)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target data carrier
- x:
cape.runmatrix.RunMatrix
The current pyCart trajectory (i.e. run matrix)
- i:
int
Index of the case from the trajectory to try to match
- Outputs:
- j:
numpy.ndarray
[int
] Array of indices that match the trajectory within tolerances
- Versions:
2014-12-21
@ddalle
: v1.0
2016-06-27
@ddalle
: v1.1; move to DataBookComp
2018-02-12
@ddalle
: v1.2; first arg DataBookComp
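The tolerance matching described above reduces to an elementwise comparison per key: a target row matches only if every toleranced key is within bounds. A self-contained sketch with a hypothetical helper name:

```python
import numpy as np

def find_match(target, case, tolerances):
    """Find indices of target rows matching one case within
    per-key tolerances.

    *target* maps key -> array of target values; *case* maps
    key -> scalar value for the case being matched.
    """
    n = len(next(iter(target.values())))
    mask = np.ones(n, dtype=bool)
    for key, tol in tolerances.items():
        # Narrow the mask with each toleranced key in turn
        mask &= np.abs(np.asarray(target[key]) - case[key]) <= tol
    return np.where(mask)[0]
```
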
- GetCoeff(comp, coeff, I, **kw)¶
Get a coefficient value for one or more cases
- Call:
>>> v = DBT.GetCoeff(comp, coeff, i)
>>> V = DBT.GetCoeff(comp, coeff, I)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target class
- comp:
str
Component whose coefficient is being plotted
- coeff:
str
Coefficient being plotted
- i:
int
Individual case/entry index
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Outputs:
- v:
float
Scalar value from the appropriate column
- V:
numpy.ndarray
Array of values from the appropriate column
- Versions:
2018-02-12
@ddalle
: v1.0
- PlotCoeff(comp, coeff, I, **kw)¶
Plot a sweep of one coefficient over several cases
- Call:
>>> h = DBT.PlotCoeff(comp, coeff, I, **kw)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target class
- comp:
str
Component whose coefficient is being plotted
- coeff:
str
Coefficient being plotted
- I:
numpy.ndarray
[int
] List of indexes of cases to include in sweep
- Keyword Arguments:
- x: [ {None} | str ]
RunMatrix key for x axis (or plot against index if None)
- Label: [ {comp} | str ]
Manually specified label
- Legend: [ {True} | False ]
Whether or not to use a legend
- StDev: [ {None} |
float
] Multiple of iterative history standard deviation to plot
- MinMax: [ {False} | True ]
Whether to plot minimum and maximum over iterative history
- Uncertainty: [ {False} | True ]
Whether to plot direct uncertainty
- PlotOptions:
dict
Plot options for the primary line(s)
- StDevOptions:
dict
Dictionary of plot options for the standard deviation plot
- MinMaxOptions:
dict
Dictionary of plot options for the min/max plot
- UncertaintyOptions:
dict
Dictionary of plot options for the uncertainty plot
- FigureWidth:
float
Width of figure in inches
- FigureHeight:
float
Height of figure in inches
- PlotTypeStDev: [ {‘FillBetween’} | ‘ErrorBar’ ]
Plot function to use for standard deviation plot
- PlotTypeMinMax: [ {‘FillBetween’} | ‘ErrorBar’ ]
Plot function to use for min/max plot
- PlotTypeUncertainty: [ ‘FillBetween’ | {‘ErrorBar’} ]
Plot function to use for uncertainty plot
- Outputs:
- h:
dict
Dictionary of plot handles
- Versions:
2015-05-30
@ddalle
: v1.0
2015-12-14
@ddalle
: Added uncertainties
- ProcessColumns()¶
Process data columns and split into dictionary keys
- Call:
>>> DBT.ProcessColumns()
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the data book target class
- Versions:
2015-06-03
@ddalle
: Copied from __init__() method
2015-12-14
@ddalle
: Added support for point sensors
- ReadAllData(fname, delimiter=', ', skiprows=0)¶
Read target data file all at once
- Call:
>>> DBT.ReadAllData(fname, delimiter=", ", skiprows=0)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target class
- fname:
str
Name of file to read
- delimiter:
str
Data delimiter character(s)
- skiprows:
int
Number of header rows to skip
- Versions:
2015-09-07
@ddalle
: v1.0
- ReadData()¶
Read data file according to stored options
- Call:
>>> DBT.ReadData()
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the data book target class
- Versions:
2015-06-03
@ddalle
: Copied from __init__() method
- ReadDataByColumn(fname, delimiter=', ', skiprows=0)¶
Read target data one column at a time
- Call:
>>> DBT.ReadDataByColumn(fname, delimiter=", ", skiprows=0)
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the Cape data book target class
- fname:
str
Name of file to read
- delimiter:
str
Data delimiter character(s)
- skiprows:
int
Number of header rows to skip
- Versions:
2015-09-07
@ddalle
: v1.0
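Reading one column at a time lets each column carry its own dtype (for example, a string configuration column beside float data). A sketch of the idea using numpy.loadtxt; this is illustrative, not Cape's implementation:

```python
import io
import numpy as np

def read_by_column(src, cols, dtypes, delimiter=",", skiprows=1):
    """Read a delimited file one column at a time, assigning each
    column its own dtype."""
    data = {}
    for j, (col, dt) in enumerate(zip(cols, dtypes)):
        src.seek(0)  # rewind between columns for file-like sources
        data[col] = np.loadtxt(
            src, delimiter=delimiter, skiprows=skiprows,
            usecols=j, dtype=dt)
    return data

# Example: a small three-column target file held in memory
text = "mach, alpha, CA\n0.8, 0.0, 0.45\n1.2, 4.0, 0.52\n"
db = read_by_column(io.StringIO(text),
                    ["mach", "alpha", "CA"],
                    [float, float, float])
```
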
- UpdateRunMatrix()¶
Match the trajectory to the cases in the data book
- Call:
>>> DBT.UpdateRunMatrix()
- Inputs:
- DBT:
cape.cfdx.databook.TargetDataBook
Instance of the data book target class
- Versions:
2015-06-03
@ddalle
: v1.0
- class cape.cfdx.databook.TimeSeriesDataBook(comp, cntl, targ=None, check=False, lock=False, **kw)¶
Individual time series component data book
This class is derived from
DataBookComp.
- Call:
>>> DBi = TimeSeriesDataBook(comp, cntl, **kw)
- Inputs:
- Outputs:
- DBi:
cape.cfdx.databook.TimeSeriesDataBook
An individual component data book
- Versions:
2024-10-09
@aburkhea
: Started
- DeleteCases(I, comp)¶
Delete list of cases from data book
- Call:
>>> n = DB.DeleteCases(I, comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of the pyCart data book class
- I:
list
[int
] List of trajectory indices
- Outputs:
- n:
int
Number of deleted entries
- Versions:
2015-03-13
@ddalle
: v1.0
2017-04-13
@ddalle
: Split by component
- ReadCase(comp)¶
Read a
CaseTS
object
- Call:
>>> fm = DB.ReadCase(comp)
- Inputs:
- DB:
cape.cfdx.databook.DataBook
Instance of data book class
- comp:
str
Name of component
- Outputs:
- fm:
cape.cfdx.databook.CaseTS
Time series iterative history class
- Versions:
2017-04-13
@ddalle
: First separate version
- ReadCaseResid()¶
Read a
CaseResid
object
- Call:
>>> H = DB.ReadCaseResid()
- Inputs:
- DB:
cape.cfdx.databook.DataBookComp
Instance of data book class
- Outputs:
- H:
cape.cfdx.databook.CaseResid
Residual history class
- Versions:
2017-04-13
@ddalle
: First separate version
- ReadCopy(check=False, lock=False)¶
Read a copied database object
- Call:
>>> DBc1 = DBc.ReadCopy(check=False, lock=False)
- Inputs:
- DBc:
cape.cfdx.databook.DataBookComp
Data book base object
- check: True | {False}
Whether or not to check LOCK status
- lock: True | {False}
If True, wait if the LOCK file exists
- Outputs:
- DBc1:
cape.cfdx.databook.DataBookComp
Copy of data book base object
- Versions:
2017-06-26
@ddalle
: v1.0
- UpdateCaseDB(i, j, comp)¶
Update or add a case to a data book
The history of a run directory is processed if any one of three criteria is met.
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- Call:
>>> n = DB.UpdateCaseDB(i, j, comp)
- Inputs:
- Outputs:
- n: 0 | 1
How many updates were made
- Versions:
2014-12-22
@ddalle
: v1.0
2017-04-12
@ddalle
: Modified to work one component
2017-04-23
@ddalle
: Added output
- cape.cfdx.databook.get_xlim(ha, pad=0.05)¶
Calculate appropriate x-limits to include all lines in a plot
Plotted objects in the class
matplotlib.lines.Line2D
are checked.
- cape.cfdx.databook.get_ylim(ha, pad=0.05)¶
Calculate appropriate y-limits to include all lines in a plot
Plotted objects in the classes
matplotlib.lines.Line2D
and matplotlib.collections.PolyCollection
are checked.
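Both functions amount to taking the extrema over all plotted data and padding by a fraction of the span. A sketch of that calculation over plain arrays; the real functions walk the matplotlib axes handle ha instead:

```python
import numpy as np

def padded_limits(values, pad=0.05):
    """Compute axis limits covering every array in *values*, padded
    by *pad* times the total data span on each side."""
    vmin = min(np.min(v) for v in values)
    vmax = max(np.max(v) for v in values)
    span = vmax - vmin
    return vmin - pad * span, vmax + pad * span
```
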