cape.cfdx.dataBook: CFD Data book module¶
cape.cfdx.dataBook: Post-processed data module¶
This module contains functions for reading and processing forces, moments, and other entities from cases in a trajectory. This module forms the core for all database post-processing in Cape, but several other database modules exist for more specific applications:
This module provides three basic classes upon which more specific data classes are developed.
The first two of these are subclassed from dict, so that
generic data can be accessed with syntax such as DB[coeff] for an
appropriately named coefficient. An outline of derived classes for
these three templates is shown below.
DBBase
    DBComp: force & moment data, one comp
    DBTarget: target data
    DBTriqFMComp: surface CP FM for one comp
    DBLineLoad: sectional load databook
    DBPointSensorGroup: group of points
        DBPointSensor: one point sensor
    DBTriqPointGroup: group of surface points
        DBTriqPoint: one surface point sensor
In addition, each solver has its own version of this module.
The parent class cape.cfdx.dataBook.DataBook provides a common
interface to all of the requested force, moment, point sensor, etc.
quantities that have been saved in the data book. Informing cape
which quantities to track, and how to statistically process them, is
done using the "DataBook" section of the JSON file, and the various
data book options are handled within the API using the
cape.cfdx.options.DataBook module.
The master data book class cape.cfdx.dataBook.DataBook is based
on the built-in dict class with keys pointing to force and
moment data books for individual components. For example, if the JSON
file tells Cape to track the forces and/or moments on a component called
"body", and the data book is the variable DB, then the forces and
moment data book is DB["body"]. This force and moment data book
contains statistically averaged forces and moments and other statistical
quantities for every case in the run matrix. The class of the force and
moment data book is cape.cfdx.dataBook.DBComp.
The data book also has the capability to store “target” data books so
that the user can compare results of the current CFD solutions to
previous results or experimental data. These are stored in
DB["Targets"] and use the cape.cfdx.dataBook.DBTarget
class. Other types of data books can also be created, such as the
cape.cfdx.pointSensor.DBPointSensor class for tracking
statistical properties at individual points in the solution field. Data
books for tracking results of groups of cases are built off of the
cape.cfdx.dataBook.DBBase class, which contains many common
tools such as plotting.
The cape.cfdx.dataBook module also contains classes for
processing results within individual case folders. These include the
cape.cfdx.dataBook.CaseFM class for reading iterative
force/moment histories and the cape.cfdx.dataBook.CaseResid
class for reading iterative histories of residuals.
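Because the data book classes are subclassed from dict, components and coefficients are reached with plain dictionary syntax such as DB["body"]["CA"]. The following is a minimal stand-in sketch of that access pattern; these DataBook and DBComp classes are simplified illustrations of the idea described above, not the real CAPE implementations:

```python
# Minimal sketch of the dict-based access pattern described above.
# DataBook and DBComp here are stand-ins, not the real CAPE classes;
# they only illustrate reaching data via DB["body"]["CA"] syntax.
import numpy as np


class DBComp(dict):
    """Stand-in force & moment data book for one component"""

    def __init__(self, comp, coeffs):
        self.comp = comp
        for coeff in coeffs:
            # One statistically averaged value per run-matrix case
            self[coeff] = np.array([])


class DataBook(dict):
    """Stand-in master data book: keys point to component data books"""

    def __init__(self, comps, coeffs=("CA", "CY", "CN")):
        self.Components = list(comps)
        for comp in comps:
            self[comp] = DBComp(comp, coeffs)


# Track forces on a component called "body", as in the example above
DB = DataBook(["body"])
DB["body"]["CA"] = np.array([0.45, 0.47, 0.46])
```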
Global data book container class¶
- class cape.cfdx.dataBook.DataBook(cntl, RootDir=None, targ=None, **kw)¶
Interface to the data book for a given CFD run matrix
- Call:
>>> DB = cape.cfdx.dataBook.DataBook(cntl, **kw)- Inputs:
- Outputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- DB.x:
cape.runmatrix.RunMatrixRun matrix of rows saved in the data book
- DB[comp]:
cape.cfdx.dataBook.DBCompComponent data book for component comp
- DB.Components:
list[str]List of force/moment components
- DB.Targets:
dictDictionary of
DBTarget target data books
- Versions:
2014-12-20
@ddalle: Started
2015-01-10
@ddalle: v1.0
2022-03-07
@ddalle: v1.1; allow .cntl
- DeleteCaseProp(I, comp=None)¶
Delete list of cases from generic-property databook
- Call:
>>> DB.DeleteCaseProp(I)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- I:
list[int]List of trajectory indices
- comp: {
None} |list|strComponent or list of components
- Versions:
2022-04-08
@ddalle: v1.0
- DeleteCasePropComp(I, comp)¶
Delete list of cases from generic-property databook comp
- Call:
>>> n = DB.DeleteCasePropComp(I, comp)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- I:
list[int]List of trajectory indices
- comp:
strName of component
- Outputs:
- n:
intNumber of deleted entries
- Versions:
2022-04-08
@ddalle: v1.0
- DeleteCases(I, comp=None)¶
Delete list of cases from data book
- Call:
>>> DB.Delete(I)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- I:
list[int]List of trajectory indices
- comp: {
None} |list|strComponent or list of components
- Versions:
2015-03-13
@ddalle: v1.0
2017-04-13
@ddalle: Split by component
- DeleteCasesComp(I, comp)¶
Delete list of cases from data book
- Call:
>>> n = DB.Delete(I)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- I:
list[int]List of trajectory indices
- Outputs:
- n:
intNumber of deleted entries
- Versions:
2015-03-13
@ddalle: v1.0
2017-04-13
@ddalle: Split by component
- DeleteDBPyFunc(I, comp=None)¶
Delete list of cases from PyFunc databook
- Call:
>>> DB.DeleteDBPyFunc(I)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- I:
list[int]List of trajectory indices
- comp: {
None} |list|strComponent or list of components
- Versions:
2022-04-12
@ddalle: v1.0
- DeleteDBPyFuncComp(I, comp)¶
Delete list of cases from PyFunc databook comp
- Call:
>>> n = DB.DeleteDBPyFuncComp(I, comp)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- I:
list[int]List of trajectory indices
- comp:
strName of component
- Outputs:
- n:
intNumber of deleted entries
- Versions:
2022-04-12
@ddalle: v1.0
- DeleteLineLoad(I, comp=None)¶
Delete list of cases from LineLoad component data books
- Call:
>>> DB.DeleteLineLoad(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- I:
list[int]List of trajectory indices
- comp: {
None} |str|listComponent wild card or list of component wild cards
- Versions:
2017-04-25
@ddalle: v1.0
- DeleteLineLoadComp(comp, I=None)¶
Delete list of cases from a LineLoad component data book
- Call:
>>> n = DB.DeleteLineLoadComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- comp:
strName of component
- I:
list[int]List of trajectory indices
- Outputs:
- n:
listNumber of deletions made
- Versions:
2017-04-25
@ddalle: v1.0
- DeleteTriqFM(I, comp=None)¶
Delete list of cases from TriqFM component data books
- Call:
>>> DB.DeleteTriqFM(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- I: {
None} |list[int]List or array of run matrix indices
- comp: {
None} |str|listComponent wild card or list of component wild cards
- Versions:
2017-04-25
@ddalle: v1.0
- DeleteTriqFMComp(comp, I=None)¶
Delete list of cases from a TriqFM component data book
- Call:
>>> n = DB.DeleteTriqFMComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- comp:
strName of component
- I: {
None} |list[int]List or array of run matrix indices
- Outputs:
- n:
listNumber of deletions made
- Versions:
2017-04-25
@ddalle: v1.0
- DeleteTriqPoint(I, comp=None)¶
Delete list of cases from TriqPoint component data books
- Call:
>>> DB.DeleteTriqPoint(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- I: {
None} |list[int]List or array of run matrix indices
- comp: {
None} |str|listComponent wild card or list of component wild cards
- Versions:
2017-10-11
@ddalle: v1.0
- DeleteTriqPointComp(comp, I=None)¶
Delete list of cases from a TriqPoint component data book
- Call:
>>> n = DB.DeleteTriqPointComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- comp:
strName of component
- I: {
None} |list[int]List or array of run matrix indices
- Outputs:
- n:
listNumber of deletions made
- Versions:
2017-04-25
@ddalle: v1.0
2017-10-11
@ddalle: From DeleteTriqFMComp()
- FindMatch(i)¶
Find an entry by run matrix (trajectory) variables
It is assumed that exact matches can be found.
- Call:
>>> j = DB.FindMatch(i)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- i:
intIndex of the case from the trajectory to try match
- Outputs:
- j:
numpy.ndarray[int]Array of index(es) that match case i or
NaN- Versions:
2016-02-27
@ddalle: Added as a pointer to first component
- FindTargetMatch(DBT, i, topts, keylist='tol', **kw)¶
Find a target entry by run matrix (trajectory) variables
Cases will be considered matches by comparing variables specified in the topts variable, which shares some of the options from the "Targets" subsection of the "DataBook" section of cape.json. Suppose that topts contains the following:

    {
        "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
        "Tolerances": {
            "alpha": 0.05,
            "Mach": 0.01
        },
        "Keys": ["alpha", "Mach", "beta"]
    }

Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled "MACH") and alpha to within 0.05 is considered a match. Because the Keys parameter contains "beta", the search will also look for exact matches in "beta".
If the Keys parameter is not set, the search will use either all the keys in the trajectory, x.cols, or just the keys specified in the "Tolerances" section of topts. Which of these two default lists to use is determined by the keylist input.
- Call:
>>> j = DB.FindTargetMatch(DBT, i, topts, **kw)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- DBT:
DBBase|DBTargetTarget component databook
- i:
intIndex of the case from the trajectory to try match
- topts:
dict|DBTargetCriteria used to determine a match
- keylist:
"x"| {"tol"}Source for default list of keys
- source: {
"self"} |"target"Match DB case i or DBT case i
- Outputs:
- j:
numpy.ndarray[int]Array of indices that match the trajectory
- See also:
- Versions:
2016-02-27
@ddalle: Added as a pointer to first component
2018-02-12
@ddalle: First input x -> DBT
- GetDBMatch(j, ftarg, tol=0.0, tols=None)¶
Get index of a target match (if any) for one data book entry
- Call:
>>> i = DB.GetDBMatch(j, ftarg, tol=0.0, tols={})- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of a data book class
- j:
int|np.nanData book target index
- ftarg:
strName of the target and column
- tol:
floatTolerance for matching all keys (
0.0enforces equality)- tols:
dictDictionary of specific tolerances for each key
- Outputs:
- i:
intData book index
- Versions:
2015-08-30
@ddalle: v1.0
- GetRefComponent()¶
Get first component with type ‘FM’, ‘Force’, or ‘Moment’
- Call:
>>> DBc = DB.GetRefComponent()- Inputs:
- DB:
cape.cfdx.dataBook.DataBookData book instance
- Outputs:
- DBc:
cape.cfdx.dataBook.DBCompData book for one component
- Versions:
2016-08-18
@ddalle: v1.0
- GetTargetByName(targ)¶
Get a target handle by name of the target
- Call:
>>> DBT = DB.GetTargetByName(targ)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- targ:
strName of target to find
- Outputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the pyCart data book target class
- Versions:
2015-06-04
@ddalle: v1.0
- GetTargetMatch(i, ftarg, tol=0.0, tols=None)¶
Get index of a target match for one data book entry
- Call:
>>> j = DB.GetTargetMatch(i, ftarg, tol=0.0, tols={})- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- i:
intData book index
- ftarg:
strName of the target and column
- tol:
floatTolerance for matching all keys
- tols:
dictDictionary of specific tolerances for each key
- Outputs:
- j:
int|np.nanData book target index
- Versions:
2015-08-30
@ddalle: v1.0
- GetTargetMatches(ftarg, tol=0.0, tols={})¶
Get vectors of indices matching targets
- Call:
>>> I, J = DB.GetTargetMatches(ftarg, tol=0.0, tols={})- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- ftarg:
strName of the target and column
- tol:
floatTolerance for matching all keys
- tols:
dictDictionary of specific tolerances for each key
- Outputs:
- I:
np.ndarrayArray of data book indices with matches
- J:
np.ndarrayArray of target indices for each data book index
- Versions:
2015-08-30
@ddalle: v1.0
- MatchRunMatrix()¶
Restrict the data book object to points in the trajectory
- Call:
>>> DB.MatchRunMatrix()- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- Versions:
2015-05-28
@ddalle: v1.0
- PlotCoeff(comp, coeff, I, **kw)¶
Plot a sweep of one coefficient over several cases
- Call:
>>> h = DB.PlotCoeff(comp, coeff, I, **kw)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- comp:
strComponent whose coefficient is being plotted
- coeff:
strCoefficient being plotted
- I:
np.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- x: [ {None} |
str]RunMatrix key for x axis (else plot against index)
- Label: {comp} |
strManually specified label
- Legend: {
True} |FalseWhether or not to use a legend
- StDev: {
None} |floatMultiple of iterative history standard deviation to plot
- MinMax:
True| {False}Option to plot min and max from iterative history
- Uncertainty:
True| {False}Whether to plot direct uncertainty
- PlotOptions:
dictPlot options for the primary line(s)
- StDevOptions:
dictPlot options for the standard deviation plot
- MinMaxOptions:
dictPlot options for the min/max plot
- UncertaintyOptions:
dictDictionary of plot options for the uncertainty plot
- FigureWidth:
floatWidth of figure in inches
- FigureHeight:
floatHeight of figure in inches
- PlotTypeStDev: {
"FillBetween"} |"ErrorBar"Plot function to use for standard deviation plot
- PlotTypeMinMax: {
"FillBetween"} |"ErrorBar"Plot function to use for min/max plot
- PlotTypeUncertainty:
"FillBetween"| {"ErrorBar"}Plot function to use for uncertainty plot
- Outputs:
- h:
dictDictionary of plot handles
- See also:
- Versions:
2015-05-30
@ddalle: v1.0
2015-12-14
@ddalle: Added error bars
- PlotContour(comp, coeff, I, **kw)¶
Create a contour plot of one coefficient over several cases
- Call:
>>> h = DB.PlotContour(comp, coeff, I, **kw)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- comp:
strComponent whose coefficient is being plotted
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- x:
strRunMatrix key for x axis
- y:
strRunMatrix key for y axis
- ContourType: {"tricontourf"} | "tricontour" | "tripcolor"
Contour plotting function to use
- LineType: {"plot"} | "triplot" | "none"
Line plotting function to highlight data points
- Label: [ {comp} |
str]Manually specified label
- ColorBar: [ {
True} |False]Whether or not to use a color bar
- ContourOptions:
dictPlot options to pass to contour plotting function
- PlotOptions:
dictPlot options for the line plot
- FigureWidth:
floatWidth of figure in inches
- FigureHeight:
floatHeight of figure in inches
- Outputs:
- h:
dictDictionary of plot handles
- See also:
- Versions:
2015-05-30
@ddalle: v1.0
2015-12-14
@ddalle: Added error bars
- ProcessComps(comp=None, **kw)¶
Process list of components
This performs several conversions:
- Call:
>>> DB.ProcessComps(comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- comp: {
None} |list|strComponent or list of components
- Versions:
2017-04-13
@ddalle: v1.0
- ReadCaseFM(comp)¶
Read a
CaseFM object
- Call:
>>> fm = DB.ReadCaseFM(comp)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp:
strName of component
- Outputs:
- fm:
cape.cfdx.dataBook.CaseFMResidual history class
- Versions:
2017-04-13
@ddalle: First separate version
- ReadCaseProp(comp)¶
Read a
CaseProp object
- Call:
>>> prop = DB.ReadCaseProp(comp)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp:
strName of component
- Outputs:
- prop:
cape.cfdx.dataBook.CasePropGeneric-property iterative history instance
- Versions:
2022-04-08
@ddalle: v1.0
- ReadCaseResid()¶
Read a
CaseResid object
- Call:
>>> H = DB.ReadCaseResid()- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- Outputs:
- H:
cape.cfdx.dataBook.CaseResidResidual history class
- Versions:
2017-04-13
@ddalle: First separate version
- ReadDBCaseProp(comp, check=False, lock=False)¶
Initialize data book for one component
- Call:
>>> DB.ReadDBCaseProp(comp, check=False, lock=False)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- comp:
strName of component
- check:
True| {False}Whether or not to check for LOCK file
- lock:
True| {False}Whether or not to create LOCK file
- Versions:
2015-11-10
@ddalle: v1.0
2017-04-13
@ddalle: Self-contained and renamed
- ReadDBComp(comp, check=False, lock=False)¶
Initialize data book for one component
- Call:
>>> DB.ReadDBComp(comp, check=False, lock=False)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- comp:
strName of component
- check:
True| {False}Whether or not to check for LOCK file
- lock:
True| {False}Whether or not to create LOCK file
- Versions:
2015-11-10
@ddalle: v1.0
2017-04-13
@ddalle: Self-contained and renamed
- ReadDBPyFunc(comp, check=False, lock=False)¶
Initialize data book for one PyFunc component
- Call:
>>> DB.ReadDBPyFunc(comp, check=False, lock=False)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- comp:
strName of component
- check:
True| {False}Whether or not to check for LOCK file
- lock:
True| {False}Whether or not to create LOCK file
- Versions:
2022-04-10
@ddalle: v1.0
- ReadLineLoad(comp, conf=None, targ=None)¶
Read a line load data
- Call:
>>> DB.ReadLineLoad(comp)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the pyCart data book class
- comp:
strLine load component group
- conf: {
None} |cape.config.ConfigSurface configuration interface
- targ: {
None} |strAlternate directory to read from, else DB.targ
- Versions:
2015-09-16
@ddalle: v1.0
2016-06-27
@ddalle: Added targ
- ReadTarget(targ)¶
Read a data book target if it is not already present
- Call:
>>> DB.ReadTarget(targ)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- targ:
strTarget name
- Versions:
2015-09-16
@ddalle: v1.0
- ReadTriqFM(comp, check=False, lock=False)¶
Read a TriqFM data book if not already present
- Call:
>>> DB.ReadTriqFM(comp, check=False, lock=False)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookData book instance
- comp:
strName of TriqFM component
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists
- Versions:
2017-03-28
@ddalle: v1.0
- Sort(key=None, I=None)¶
Sort a data book according to either a key or an index
- Call:
>>> DB.Sort() >>> DB.Sort(key) >>> DB.Sort(I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- key:
str|list[str]Name of trajectory key or list of keys on which to sort
- I:
np.ndarray[int]List of indices; must have same size as data book
- Versions:
2014-12-30
@ddalle: v1.0
2015-06-19
@ddalle: New multi-key sort
2016-01-13
@ddalle: Checks to allow incomplete comps
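A multi-key sort like the one Sort() performs can be sketched with numpy.lexsort; the dict of column arrays below is a stand-in for the real databook attributes, not CAPE's actual storage:

```python
# Sketch of a multi-key databook sort using numpy.lexsort.
import numpy as np


def argsort_databook(DBc, keys):
    """Indices that sort DBc by *keys* (first key is primary)"""
    # numpy.lexsort treats the LAST key as primary, so reverse the list
    return np.lexsort([np.asarray(DBc[k]) for k in reversed(keys)])


DBc = {"mach": np.array([1.2, 0.8, 1.2, 0.8]),
       "alpha": np.array([4.0, 2.0, 0.0, 0.0])}
# Sort by Mach number first, then angle of attack
I = argsort_databook(DBc, ["mach", "alpha"])
```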
- UpdateCaseComp(i, comp)¶
Update or add a case to a data book
The history of a run directory is processed if any one of three criteria is met:
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- Call:
>>> n = DB.UpdateCaseComp(i, comp)- Inputs:
- Outputs:
- n:
0|1How many updates were made
- Versions:
2014-12-22
@ddalle: v1.0
2017-04-12
@ddalle: Modified to work on one component
2017-04-23
@ddalle: Added output
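The three update criteria listed above can be sketched as a standalone predicate; the record layout (a dict with "nIter" and "nStats") is hypothetical, not CAPE's actual databook storage:

```python
# Sketch of the three update criteria listed above. The record layout
# (dicts with "nIter" and "nStats") is hypothetical and only
# illustrates the decision logic.
def needs_update(db_record, case_iter, case_nstats):
    """Return True if a case should be (re)processed into the databook"""
    # 1. The case is not already in the data book
    if db_record is None:
        return True
    # 2. The most recent iteration is greater than the data book value
    if case_iter > db_record["nIter"]:
        return True
    # 3. The number of iterations used for statistics has changed
    if case_nstats != db_record["nStats"]:
        return True
    return False
```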
- UpdateCaseProp(I, comp=None)¶
Update a generic-property databook
- Call:
>>> DB.UpdateCaseProp(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp: {
None} |strName of TriqFM data book component (default is all)
- I:
list[int]List of trajectory indices
- Versions:
2022-04-08
@ddalle: v1.0
- UpdateCasePropCase(i, comp)¶
Update or add a case to a generic-property data book
The history of a run directory is processed if any one of three criteria is met:
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- UpdateCasePropComp(comp, I=None)¶
Update a component of the generic-property data book
- Call:
>>> DB.UpdateCasePropComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp:
strName of TriqFM data book component
- I: {
None} |list[int]List or array of run matrix indices
- Versions:
2022-04-08
@ddalle: v1.0
- UpdateDBPyFunc(I, comp=None)¶
Update a scalar Python function output databook
- Call:
>>> DB.UpdateDBPyFunc(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp: {
None} |strName of TriqFM data book component (default is all)
- I:
list[int]List of trajectory indices
- Versions:
2022-04-10
@ddalle: v1.0
- UpdateDBPyFuncCase(i, comp)¶
Update or add a case to a PyFunc data book
The history of a run directory is processed if any one of three criteria is met:
The case is not already in the data book
The most recent iteration is greater than the data book value
The number of iterations used to create statistics has changed
- UpdateDBPyFuncComp(comp, I=None)¶
Update a PyFunc component of the databook
- Call:
>>> DB.UpdateDBPyFuncComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp:
strName of TriqFM data book component
- I: {
None} |list[int]List or array of run matrix indices
- Versions:
2022-04-10
@ddalle: v1.0
- UpdateDataBook(I=None, comp=None)¶
Update the data book for a list of cases from the run matrix
- Call:
>>> DB.UpdateDataBook(I=None, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the data book class
- I:
list[int] |NoneList of trajectory indices to update
- comp: {
None} |list|strComponent or list of components
- Versions:
2014-12-22
@ddalle: v1.0
2017-04-12
@ddalle: Split by component
- UpdateLineLoad(I, comp=None, conf=None)¶
Update a line load data book for a list of cases
- Call:
>>> n = DB.UpdateLineLoad(I, comp=None, conf=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- I:
list[int]List of trajectory indices
- comp: {
None} |strLine load DataBook component or wild card
- Outputs:
- n:
intNumber of cases updated or added
- Versions:
2015-09-17
@ddalle: v1.0
2016-12-20
@ddalle: Copied to cape
2017-04-25
@ddalle: Added wild cards
- UpdateLineLoadComp(comp, I=None, conf=None)¶
Update a line load data book for a list of cases
- Call:
>>> n = DB.UpdateLineLoadComp(comp, conf=None, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp:
strName of line load DataBook component
- I: {
None} |list[int]List of trajectory indices
- qpbs:
True| {False}Whether or not to submit as a script
- Outputs:
- n:
intNumber of cases updated or added
- Versions:
2015-09-17
@ddalle: v1.0
2016-12-20
@ddalle: Copied to cape
- UpdateRunMatrix()¶
Match the trajectory to the cases in the data book
- Call:
>>> DB.UpdateRunMatrix()- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- Versions:
2015-05-22
@ddalle: v1.0
- UpdateTriqFM(I, comp=None)¶
Update a TriqFM triangulation-extracted F&M data book
- Call:
>>> DB.UpdateTriqFM(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp: {
None} |strName of TriqFM data book component (default is all)
- I:
list[int]List of trajectory indices
- Versions:
2017-03-29
@ddalle: v1.0
- UpdateTriqFMComp(comp, I=None)¶
Update a TriqFM triangulation-extracted F&M data book
- Call:
>>> DB.UpdateTriqFMComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp:
strName of TriqFM data book component
- I: {
None} |list[int]List or array of run matrix indices
- Versions:
2017-03-29
@ddalle: v1.0
- UpdateTriqPoint(I, comp=None)¶
Update a TriqPoint triangulation-extracted point sensor data book
- Call:
>>> DB.UpdateTriqPoint(I, comp=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- I:
list[int]List or array of run matrix indices
- comp: {
None} |strName of TriqPoint group or all if
None- Versions:
2017-10-11
@ddalle: v1.0
- UpdateTriqPointComp(comp, I=None)¶
Update a TriqPoint triangulation-extracted data book
- Call:
>>> n = DB.UpdateTriqPointComp(comp, I=None)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of data book class
- comp: {
None} |strName of TriqPoint group or all if
None- I: {
None} |list[int]List or array of run matrix indices
- Outputs:
- n:
intNumber of updates made
- Versions:
2017-10-11
@ddalle: v1.0
- Write(unlock=True)¶
Write the current data book in Python memory to file
- Call:
>>> DB.Write(unlock=True)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- Versions:
2014-12-22
@ddalle: v1.0
2015-06-19
@ddalle: New multi-key sort
2017-06-12
@ddalle: Added unlock
- mkdir(fdir)¶
Create a directory using settings from DataBook>umask
- Call:
>>> DB.mkdir(fdir)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- fdir:
strDirectory to create
- Versions:
2017-09-05
@ddalle: v1.0
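The umask-based directory creation can be sketched as follows; the exact permission rule (full permissions masked by the configured umask) is an assumption about the option's intent, not CAPE's verbatim code:

```python
# Sketch of creating a directory with permissions derived from a
# "DataBook" > "umask" setting; the 0o777 & ~umask rule is an
# assumption, not CAPE's verbatim implementation.
import os


def mkdir_with_umask(fdir, umask=0o027):
    """Create *fdir* with mode 0o777 masked by *umask*"""
    mode = 0o777 & ~umask  # e.g. umask 0o027 -> mode 0o750
    os.makedirs(fdir, mode=mode, exist_ok=True)
```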
Individual data books¶
- class cape.cfdx.dataBook.DBBase(comp, cntl, check=False, lock=False, **kw)¶
Individual item data book base class
- Call:
>>> DBi = DBBase(comp, cntl, check=False, lock=False)- Inputs:
- comp:
strName of the component or other item name
- cntl:
CntlCAPE control class instance
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists
- Outputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- Versions:
2014-12-22
@ddalle: v1.0
2015-12-04
@ddalle: Forked from DBComp
- ArgSort(key=None)¶
Return indices that would sort a data book by a trajectory key
- Call:
>>> I = DBi.ArgSort(key=None)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- key:
strName of trajectory key to use for sorting; default is first key
- Outputs:
- I:
numpy.ndarray[int]List of indices; must have same size as data book
- Versions:
2014-12-30
@ddalle: v1.0
- CheckLock()¶
Check if lock file for this component exists
- Call:
>>> q = DBc.CheckLock()- Inputs:
- DBc:
cape.cfdx.dataBook.DataBookBaseData book base object
- Outputs:
- q:
boolWhether or not corresponding LOCK file exists
- Versions:
2017-06-12
@ddalle: v1.0
- EstimateLineCount(fname=None)¶
Get a conservative (high) estimate of the number of lines in a file
- Call:
>>> n, pos = DBP.EstimateLineCount(fname)- Inputs:
- DBP:
cape.cfdx.dataBook.DBBaseData book base object
- fname:
strName of data file to read
- Outputs:
- Versions:
2016-03-15
@ddalle: v1.0
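One way to get a conservative (high) line-count estimate is to scan an initial chunk of the file, take the shortest line seen, and divide the file size by that length. This sketch follows that idea; it is not CAPE's implementation:

```python
# Sketch of a conservative (high) line-count estimate: read one chunk,
# find the shortest line in it, and divide the file size by that
# length. This illustrates the idea, not CAPE's actual code.
import os


def estimate_line_count(fname, chunk=8192):
    """Return (n, pos): high estimate of line count, bytes scanned"""
    with open(fname, "rb") as f:
        buf = f.read(chunk)
        pos = f.tell()
    size = os.path.getsize(fname)
    lines = buf.split(b"\n")
    # Use only complete lines; the shortest one gives the most
    # conservative (largest) estimate of the total line count
    complete = lines[:-1] if len(lines) > 1 else lines
    nmin = max(1, min(len(ln) + 1 for ln in complete))
    n = size // nmin + 1
    return n, pos
```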
- FindCoSweep(x, i, EqCons=[], TolCons={}, GlobCons=[], xkeys={})¶
Find data book entries meeting constraints seeded from point i
Cases will be considered matches if data book values match trajectory x at point i. For example, suppose EqCons and TolCons have the following values:

    EqCons = ["beta"]
    TolCons = {"alpha": 0.05, "mach": 0.01}

Then this method will compare DBc["mach"] to x.mach[i]. Any case that passes all of the following tests will be included:

    abs(DBc["mach"] - x.mach[i]) <= 0.01
    abs(DBc["alpha"] - x.alpha[i]) <= 0.05
    DBc["beta"] == x.beta[i]

All entries must also meet a list of global constraints from GlobCons. Users can also use xkeys as a dictionary of alternate key names to compare to the trajectory. Consider the following values:

    TolCons = {"alpha": 0.05}
    xkeys = {"alpha": "AOA"}

Then the test becomes:

    abs(DBc["AOA"] - x.alpha[i]) <= 0.05
- Call:
>>> J = DBc.FindCoSweep(x, i, EqCons={}, TolCons={}, **kw)- Inputs:
- DBc:
cape.cfdx.dataBook.DBBaseData book component instance
- x:
cape.runmatrix.RunMatrixRunMatrix (i.e. run matrix) to use for target value
- i:
intIndex of the case from the trajectory to try match
- EqCons: {
[]} |list(str)List of variables that must match the trajectory exactly
- TolCons: {
{}} |dict[float]List of variables that may match trajectory within a tolerance
- GlobCons: {
[]} |list(str)List of global constraints, see
cape.RunMatrix.Filter()- xkeys: {
{}} |dict(str)Dictionary of alternative names of variables
- Outputs:
- J:
numpy.ndarray[int]Array of indices that match the trajectory within tolerances
- See also:
- Versions:
2014-12-21
@ddalle: v1.0
2016-06-27
@ddalle: Moved from DBTarget and generalized
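The constraint logic above can be sketched standalone, with plain dicts and numpy arrays standing in for the real DBBase and RunMatrix classes:

```python
# Standalone sketch of the EqCons/TolCons/xkeys constraint logic
# described above; DBc and x are plain dict stand-ins for the real
# DBBase and RunMatrix classes.
import numpy as np


def find_co_sweep(DBc, x, i, EqCons=(), TolCons=None, xkeys=None):
    """Return indices of DBc entries matching case i of run matrix x"""
    TolCons = TolCons or {}
    xkeys = xkeys or {}
    n = len(next(iter(DBc.values())))
    mask = np.ones(n, dtype=bool)
    # Exact-match constraints, e.g. DBc["beta"] == x["beta"][i]
    for key in EqCons:
        col = xkeys.get(key, key)
        mask &= np.asarray(DBc[col]) == x[key][i]
    # Tolerance constraints, e.g. abs(DBc["mach"] - x["mach"][i]) <= tol
    for key, tol in TolCons.items():
        col = xkeys.get(key, key)
        mask &= np.abs(np.asarray(DBc[col]) - x[key][i]) <= tol
    return np.where(mask)[0]


# Mirror the example: match "beta" exactly, "alpha" and "mach" within tol
x = {"mach": np.array([0.80]), "alpha": np.array([2.0]),
     "beta": np.array([0.0])}
DBc = {"mach": [0.80, 0.81, 0.95], "alpha": [2.04, 2.00, 2.0],
       "beta": [0.0, 0.5, 0.0]}
J = find_co_sweep(DBc, x, 0, EqCons=["beta"],
                  TolCons={"alpha": 0.05, "mach": 0.01})
```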
- FindDBMatch(DBc, i)¶
Find the index of an exact match to case i in another databook
- Call:
>>> j = DBi.FindDBMatch(DBc, i)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseData book base object
- DBc:
cape.cfdx.dataBook.DBBaseAnother data book base object
- i:
intData book index for DBi
- Outputs:
- j:
None|intData book index for DBj
- Versions:
2017-06-26
@ddalle: v1.0
- FindMatch(i)¶
Find an entry by run matrix (trajectory) variables
It is assumed that exact matches can be found. However, run matrix keys that do not affect the name of the case folder are not used in the comparison.
- Call:
>>> j = DBi.FindMatch(i)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- i:
intIndex of the case from the trajectory to try match
- Outputs:
- j:
numpy.ndarray[int]Array of index that matches the trajectory case or
NaN- Versions:
2014-12-22
@ddalle: v1.0
- FindTargetMatch(DBT, i, topts={}, keylist='tol', **kw)¶
Find a target entry by run matrix (trajectory) variables
Cases will be considered matches by comparing variables specified in the topts variable, which shares some of the options from the "Targets" subsection of the "DataBook" section of cape.json. Suppose that topts contains the following:

    {
        "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
        "Tolerances": {
            "alpha": 0.05,
            "Mach": 0.01
        },
        "Keys": ["alpha", "Mach", "beta"]
    }

Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled "MACH") and alpha to within 0.05 is considered a match. Because the Keys parameter contains "beta", the search will also look for exact matches in "beta".
If the Keys parameter is not set, the search will use either all the keys in the trajectory, x.cols, or just the keys specified in the "Tolerances" section of topts. Which of these two default lists to use is determined by the keylist input.
- Call:
>>> j = DBc.FindTargetMatch(DBT, i, topts, keylist='x', **kw)- Inputs:
- DBc:
cape.cfdx.dataBook.DBBase|DBTargetInstance of original databook
- DBT:
DBBase|DBTargetTarget databook of any type
- i:
intIndex of the case from either DBc.x or DBT.x to match
- topts:
dict|DBTargetCriteria used to determine a match
- keylist: {
"x"} |"tol"Default test key source:
x.colsortopts.Tolerances- source:
"self"| {"target"}Match DBc.x case i if
"self", else DBT.x case i- Outputs:
- j:
numpy.ndarray[int]Array of indices that match the trajectory within tolerances
- See also:
- Versions:
2014-12-21
@ddalle: v1.0
2016-06-27
@ddalle: Moved from DBTarget and generalized
2018-02-12
@ddalle: Changed first input to DBBase
- GetCoeff(comp, coeff, I, **kw)¶
Get a coefficient value for one or more cases
- Call:
>>> v = DBT.GetCoeff(comp, coeff, i) >>> V = DBT.GetCoeff(comp, coeff, I)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target class
- comp:
strComponent whose coefficient is being plotted
- coeff:
strCoefficient being plotted
- i:
intIndividual case/entry index
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Outputs:
- v:
floatScalar value from the appropriate column
- V:
np.ndarrayArray of values from the appropriate column
- Versions:
2018-02-12
@ddalle: v1.0
- GetDeltaStats(DBT, comp, coeff, I, topts={}, **kw)¶
Calculate statistics on differences between two databooks
- Call:
>>> S = DBc.GetDeltaStats(DBT, coeff, I, topts=None, **kw)- Inputs:
- DBc:
cape.cfdx.dataBook.DBBaseComponent databook
- coeff:
strName of coefficient on which to compute statistics
- I:
list[int]Indices of cases/entries to consider
- topts: {
{}} |dictDictionary of tolerances for variables in question
- keylist: {
"x"} |"tol"Default test key source:
x.colsortopts.Tolerances- CombineTarget: {
True} |FalseFor cases with multiple matches, compare to mean target value
- Outputs:
- S:
dictDictionary of statistical results
- S[“delta”]:
np.ndarrayArray of deltas for each valid case
- S[“n”]:
intNumber of valid cases
- S[“mu”]:
floatMean of histogram
- Versions:
2018-02-12
@ddalle: v1.0
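A minimal sketch of the delta computation, assuming dict/array inputs in place of the actual databook objects and the CombineTarget=True averaging behavior described above:

```python
import numpy as np

def delta_stats(db_vals, target_vals, indices):
    """Compute databook-minus-target statistics for the given case indices.

    *db_vals* is an array of databook values; *target_vals* is a list in
    which each entry holds the matching target values for that case.
    Following CombineTarget=True, multiple matches are averaged first.
    """
    deltas = []
    for i in indices:
        matches = np.atleast_1d(target_vals[i])
        if matches.size == 0:
            continue  # no target match: case is excluded
        deltas.append(db_vals[i] - np.mean(matches))
    deltas = np.asarray(deltas)
    return {"delta": deltas, "n": deltas.size, "mu": float(np.mean(deltas))}
```

The returned dict mirrors the S["delta"], S["n"], and S["mu"] outputs documented above.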
- GetLockFile()¶
Get the name of the potential lock file
- Call:
>>> flock = DBc.GetLockFile()- Inputs:
- DBc:
cape.cfdx.dataBook.DataBookBaseData book base object
- Outputs:
- flock:
strFull path to potential
lockfile- Versions:
2017-06-12
@ddalle: v1.0
- GetRunMatrixIndex(j)¶
Find an entry in the run matrix (trajectory)
- Call:
>>> i = DBi.GetRunMatrixIndex(self, j)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- j:
intIndex of the case from the databook to match
- Outputs:
- i:
intRunMatrix index or
None- Versions:
2015-05-28
@ddalle: v1.0
- Lock()¶
Write a ‘LOCK’ file for a data book component
- Call:
>>> DBc.Lock()- Inputs:
- DBc:
cape.cfdx.dataBook.DataBookBaseData book base object
- Versions:
2017-06-12
@ddalle: v1.0
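GetLockFile(), Lock(), TouchLock(), and Unlock() together describe a simple file-based mutex; the sketch below imitates that protocol with an assumed "LOCK.&lt;comp&gt;" file name (the real naming may differ).

```python
import os

class LockFileMixin:
    """Minimal file-based lock imitating the LOCK-file protocol above.

    The "LOCK.<comp>" naming is an assumption for illustration.
    """

    def __init__(self, fdir, comp):
        self.flock = os.path.join(fdir, "LOCK.%s" % comp)

    def lock(self):
        # Create (or refresh) the lock file
        with open(self.flock, "w") as f:
            f.write("LOCK\n")

    def touch_lock(self):
        # Reset modification time without rewriting contents
        os.utime(self.flock, None)

    def check(self):
        # True if a lock file is present
        return os.path.isfile(self.flock)

    def unlock(self):
        # Delete the lock file if it exists
        if os.path.isfile(self.flock):
            os.remove(self.flock)
```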
- Merge(DBc)¶
Merge another copy of the data book object
- Call:
>>> DBi.Merge(DBc)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseComponent data book
- DBc:
cape.cfdx.dataBook.DBBaseCopy of component data book, perhaps read at a different time
- Versions:
2017-06-26
@ddalle: v1.0
- PlotCoeff(coeff, I, **kw)¶
Plot a sweep of one coefficient over several cases
- Call:
>>> h = DBi.PlotCoeff(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2015-05-30
@ddalle: v1.02015-12-14
@ddalle: Added error bars
- PlotCoeffBase(coeff, I, **kw)¶
Plot sweep of one coefficient or quantity over several cases
This is the base method upon which data book sweep plotting is built. Other methods may call this one with modifications to the default settings. For example
cape.cfdx.dataBook.DBTarget.PlotCoeff()changes the default PlotOptions to show a red line instead of the standard black line. All settings can still be overruled by explicit inputs to either this function or any of its children.
- Call:
>>> h = DBi.PlotCoeffBase(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- x: {
None} |strRunMatrix key for x axis (or plot against index if
None)- Label: {comp} |
strManually specified label
- Legend: {
True} |FalseWhether or not to use a legend
- StDev: {
None} |floatMultiple of iterative history standard deviation to plot
- MinMax: {
False} |TrueWhether to plot minimum and maximum over iterative history
- Uncertainty: {
False} |TrueWhether to plot direct uncertainty
- PlotOptions:
dictPlot options for the primary line(s)
- StDevOptions:
dictDictionary of plot options for the standard deviation plot
- MinMaxOptions:
dictDictionary of plot options for the min/max plot
- UncertaintyOptions:
dictDictionary of plot options for the uncertainty plot
- FigureWidth:
floatWidth of figure in inches
- FigureHeight:
floatHeight of figure in inches
- PlotTypeStDev: {
'FillBetween'} |'ErrorBar'Plot function to use for standard deviation plot
- PlotTypeMinMax: {
'FillBetween'} |'ErrorBar'Plot function to use for min/max plot
- PlotTypeUncertainty:
'FillBetween'| {'ErrorBar'}Plot function to use for uncertainty plot
- LegendFontSize: {
9} |int> 0 |floatFont size for use in legends
- Grid: {
None} |True|FalseTurn on/off major grid lines, or leave as is if
None- GridStyle: {
{}} |dictDictionary of major grid line line style options
- MinorGrid: {
None} |True|FalseTurn on/off minor grid lines, or leave as is if
None- MinorGridStyle: {
{}} |dictDictionary of minor grid line line style options
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2015-05-30
@ddalle: v1.02015-12-14
@ddalle: Added error bars
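The precedence described above (base defaults, then subclass defaults such as DBTarget's red line, then explicit inputs) is a dict-layering pattern; a hypothetical sketch, not the actual PlotCoeffBase code:

```python
def resolve_plot_options(defaults, subclass_defaults=None, **kw):
    """Layer plot options: base defaults first, then subclass
    defaults, then explicit PlotOptions keyword input (highest
    precedence)."""
    opts = dict(defaults)
    opts.update(subclass_defaults or {})
    opts.update(kw.get("PlotOptions", {}))
    return opts

# Base default: black line; DBTarget-style subclass default: red line
base = {"color": "k", "linewidth": 1.5}
target_style = {"color": "r"}
```

With no explicit input the subclass default wins; an explicit PlotOptions entry overrules both while untouched settings (here, linewidth) pass through.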
- PlotContour(coeff, I, **kw)¶
Create a contour plot for a subset of cases
- Call:
>>> h = DBi.PlotContour(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2017-04-17
@ddalle: v1.0
- PlotContourBase(coeff, I, **kw)¶
Create a contour plot of selected data points
- Call:
>>> h = DBi.PlotContourBase(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- x:
strRunMatrix key for x axis
- y:
strRunMatrix key for y axis
- ContourType: {“tricontourf”} | “tricontour” | “tripcolor”
Contour plotting function to use
- LineType: {“plot”} | “triplot” | “none”
Line plotting function to highlight data points
- Label: [ {comp} |
str]Manually specified label
- ColorMap: {
"jet"} |strName of color map to use
- ColorBar: [ {
True} |False]Whether or not to use a color bar
- ContourOptions:
dictPlot options to pass to contour plotting function
- PlotOptions:
dictPlot options for the line plot
- FigureWidth:
floatWidth of figure in inches
- FigureHeight:
floatHeight of figure in inches
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2017-04-17
@ddalle: v1.0
- PlotHist(coeff, I, **kw)¶
Plot a histogram over several cases
- Call:
>>> h = DBi.PlotValueHist(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2016-04-04
@ddalle: v1.0
- PlotHistBase(coeff, I, **kw)¶
Plot a histogram of one coefficient over several cases
- Call:
>>> h = DBi.PlotHistBase(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- FigureWidth:
floatFigure width
- FigureHeight:
floatFigure height
- Label: [ {comp} |
str]Manually specified label
- Target: {
None} |DBBase|listTarget database or list thereof
- TargetValue:
float|list[float]Target or list of target values
- TargetLabel:
str|list(str)Legend label(s) for target(s)
- StDev: [ {None} |
float]Multiple of iterative history standard deviation to plot
- HistOptions:
dictPlot options for the primary histogram
- StDevOptions:
dictDictionary of plot options for the standard deviation plot
- DeltaOptions:
dictOptions passed to
plt.plot()for reference range plot- MeanOptions:
dictOptions passed to
plt.plot()for mean line- TargetOptions:
dictOptions passed to
plt.plot()for target value lines- OutlierSigma: {
7.0} |floatStandard deviation multiplier for determining outliers
- ShowMu:
boolOption to print value of mean
- ShowSigma:
boolOption to print value of standard deviation
- ShowError:
boolOption to print value of sampling error
- ShowDelta:
boolOption to print reference value
- ShowTarget:
boolOption to show target value
- MuFormat: {
"%.4f"} |strFormat for text label of the mean value
- DeltaFormat: {
"%.4f"} |strFormat for text label of the reference value d
- SigmaFormat: {
"%.4f"} |strFormat for text label of the iterative standard deviation
- TargetFormat: {
"%.4f"} |strFormat for text label of the target value
- XLabel:
strSpecified label for x-axis, default is
Iteration Number- YLabel:
strSpecified label for y-axis, default is c
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2015-05-30
@ddalle: v1.02015-12-14
@ddalle: Added error bars2016-04-04
@ddalle: Moved from point sensor to data book
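The OutlierSigma option suggests a pre-histogram filter like the following; this is an assumption about the behavior, written as a standalone NumPy helper:

```python
import numpy as np

def filter_outliers(v, outlier_sigma=7.0):
    """Drop samples more than *outlier_sigma* standard deviations
    from the mean before building a histogram, as suggested by the
    OutlierSigma option."""
    v = np.asarray(v, dtype=float)
    mu = np.mean(v)
    sig = np.std(v)
    if sig == 0.0:
        return v
    return v[np.abs(v - mu) <= outlier_sigma * sig]
```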
- PlotRangeHist(coeff, I, **kw)¶
Plot a range histogram over several cases
- Call:
>>> h = DBi.PlotRangeHist(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2016-04-04
@ddalle: v1.0
- PlotRangeHistBase(coeff, I, **kw)¶
Plot a range histogram of one coefficient over several cases
- Call:
>>> h = DBi.PlotRangeHistBase(coeff, I, **kw)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- FigureWidth:
floatFigure width
- FigureHeight:
floatFigure height
- Label: {comp} |
strManually specified label
- Target:
DBBase|listTarget database or list thereof
- TargetValue:
float|list[float]Target or list of target values
- TargetLabel:
str|list(str)Legend label(s) for target(s)
- StDev: {
3.6863} |None|floatMultiple of iterative history standard deviation to plot
- HistOptions:
dictPlot options for the primary histogram
- StDevOptions:
dictDictionary of plot options for the standard deviation plot
- DeltaOptions:
dictOptions passed to
plt.plot()for reference range plot- TargetOptions:
dictOptions passed to
plt.plot()for target value lines- OutlierSigma: {
3.6863} |floatStandard deviation multiplier for determining outliers
- ShowMu:
boolOption to print value of mean
- ShowSigma:
boolOption to print value of standard deviation
- ShowDelta:
boolOption to print reference value
- ShowTarget:
boolOption to show target value
- MuFormat: {
"%.4f"} |strFormat for text label of the mean value
- DeltaFormat: {
"%.4f"} |strFormat for text label of the reference value d
- SigmaFormat: {
"%.4f"} |strFormat for text label of the iterative standard deviation
- TargetFormat: {
"%.4f"} |strFormat for text label of the target value
- XLabel:
strSpecified label for x-axis, default is
Iteration Number- YLabel:
strSpecified label for y-axis, default is c
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2015-05-30
@ddalle: v1.02015-12-14
@ddalle: Added error bars2016-04-04
@ddalle: Moved from point sensor to data book
- ProcessColumns()¶
Process column names
- Call:
>>> DBi.ProcessColumns()- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseData book base object
- Effects:
- DBi.xCols:
list(str)List of trajectory keys
- DBi.fCols:
list(str)List of floating point data columns
- DBi.iCols:
list(str)List of integer data columns
- DBi.cols:
list(str)Total list of columns
- DBi.nxCol:
intNumber of trajectory keys
- DBi.nfCol:
intNumber of floating point keys
- DBi.niCol:
intNumber of integer data columns
- DBi.nCol:
intTotal number of columns
- Versions:
2016-03-15
@ddalle: v1.0
- ProcessConverters()¶
Process the list of converters to read and write each column
- Read(fname=None, check=False, lock=False)¶
Read a data book statistics file
- Call:
>>> DBc.Read() >>> DBc.Read(fname, check=False, lock=False)- Inputs:
- DBc:
cape.cfdx.dataBook.DBBaseData book base object
- fname:
strName of data file to read
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists- Versions:
2015-12-04
@ddalle: v1.02017-06-12
@ddalle: Added lock
- ReadCopy(check=False, lock=False)¶
Read a copied database object
- Call:
>>> DBc1 = DBc.ReadCopy(check=False, lock=False)- Inputs:
- DBc:
cape.cfdx.dataBook.DBBaseData book base object
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists- Outputs:
- DBc1:
cape.cfdx.dataBook.DBBaseCopy of data book base object
- Versions:
2017-06-26
@ddalle: v1.0
- Sort(key=None, I=None)¶
Sort a data book according to either a key or an index
- Call:
>>> DBi.Sort() >>> DBi.Sort(key) >>> DBi.Sort(I=None)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- key:
strName of trajectory key to use for sorting; default is first key
- I:
numpy.ndarray[int]List of indices; must have same size as data book
- Versions:
2014-12-30
@ddalle: v1.02017-04-18
@ddalle: Usingnp.lexsort()
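The version note mentions np.lexsort(); its key-ordering semantics (the last key in the tuple is the primary sort key) can be checked in isolation:

```python
import numpy as np

# Two run-matrix-like columns: sort primarily by mach, then by alpha.
# np.lexsort treats the LAST key in the tuple as the primary key.
mach = np.array([0.9, 0.8, 0.9, 0.8])
alpha = np.array([2.0, 4.0, 0.0, 2.0])
I = np.lexsort((alpha, mach))
# I -> array([3, 1, 2, 0])
```

Indexing each column by I then yields the data book sorted by (mach, alpha).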
- TouchLock()¶
Touch a ‘LOCK’ file for a data book component to reset its mod time
- Call:
>>> DBc.TouchLock()- Inputs:
- DBc:
cape.cfdx.dataBook.DataBookBaseData book base object
- Versions:
2017-06-14
@ddalle: v1.0
- TransformDBFM(topts, mask=None)¶
Transform force and moment coefficients
Available transformations and their parameters are
“Euler123”: “phi”, “theta”, “psi”
“Euler321”: “psi”, “theta”, “phi”
“ScaleCoeffs”: “CA”, “CY”, “CN”, “CLL”, “CLM”, “CLN”
Other variables (columns) in the databook are used to specify values to use for the transformation variables. For example,
topts = { "Type": "Euler321", "psi": "Psi", "theta": "Theta", "phi": "Phi", }will cause this function to perform a reverse Euler 3-2-1 transformation using dbc[“Psi”], dbc[“Theta”], and dbc[“Phi”] as the angles.
Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.
topts = { "Type": "ScaleCoeffs", "CLL": -1.0, "CLN": -1.0, }
- Call:
>>> dbc.TransformDBFM(topts, mask=None)- Inputs:
- dbc:
DBBaseInstance of the force and moment class
- topts:
dictDictionary of options for the transformation
- mask: {
None} |np.ndarray[int]Optional subset of cases to transform
- Versions:
2021-11-18
@ddalle: v1.0
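The "ScaleCoeffs" branch (including the default CLL/CLN axis flip shown above) can be sketched against a plain dict of arrays; the function name and data layout are illustrative, not the actual DBBase code:

```python
import numpy as np

def scale_coeffs(dbc, topts, mask=None):
    """Apply a "ScaleCoeffs" transformation in place.

    *dbc* is a dict of coefficient arrays; *topts* maps coefficient
    names to multipliers; *mask* optionally restricts the case subset.
    """
    for coeff in ("CA", "CY", "CN", "CLL", "CLM", "CLN"):
        k = topts.get(coeff)
        if k is None or coeff not in dbc:
            continue
        if mask is None:
            dbc[coeff] = k * dbc[coeff]
        else:
            dbc[coeff][mask] = k * dbc[coeff][mask]

# Default flip from CFD axes to flight-dynamics axes
topts = {"Type": "ScaleCoeffs", "CLL": -1.0, "CLN": -1.0}
```

Coefficients without an entry in topts (here CA) pass through unchanged.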
- Unlock()¶
Delete the LOCK file if it exists
- Call:
>>> DBc.Unlock()- Inputs:
- DBc:
cape.cfdx.dataBook.DataBookBaseData book base object
- Versions:
2017-06-12
@ddalle: v1.0
- UpdateRunMatrix()¶
Match the trajectory to the cases in the data book
- Call:
>>> DBi.UpdateRunMatrix()- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseComponent data book
- Versions:
2017-04-18
@ddalle: v1.0
- Write(fname=None, merge=False, unlock=True)¶
Write a single data book summary file
- Call:
>>> DBi.Write() >>> DBi.Write(fname, merge=False, unlock=True)- Inputs:
- DBi:
cape.cfdx.dataBook.DBBaseAn individual item data book
- fname:
strName of data file to read
- merge:
True| {False}Whether or not to attempt a merger before writing
- unlock: {
True} |FalseWhether or not to delete any lock files
- Versions:
2015-12-04
@ddalle: v1.02017-06-12
@ddalle: Added unlock2017-06-26
@ddalle: Added merge
- mkdir(fdir)¶
Create a directory using settings from DataBook>umask
- Call:
>>> DB.mkdir(fdir)- Inputs:
- DB:
cape.cfdx.dataBook.DataBookInstance of the Cape data book class
- fdir:
strDirectory to create
- Versions:
2017-09-05
@ddalle: v1.0
- class cape.cfdx.dataBook.DBComp(comp, cntl, targ=None, check=False, lock=False, **kw)¶
Individual force & moment component data book
This class is derived from
cape.cfdx.dataBook.DBBase.
- Call:
>>> DBi = DBComp(comp, cntl, targ=None, check=None, lock=None)- Inputs:
- Outputs:
- DBi:
cape.cfdx.dataBook.DBCompAn individual component data book
- Versions:
2014-12-20
@ddalle: Started2014-12-22
@ddalle: v1.02016-06-27
@ddalle: Added target option for using other folders
- class cape.cfdx.dataBook.DBTarget(targ, x, opts, RootDir=None)¶
Class to handle data from data book target files. There are more constraints on target files than the files that data book creates, and raw data books created by pyCart are not valid target files.
- Call:
>>> DBT = DBTarget(targ, x, opts, RootDir=None)- Inputs:
- targ:
cape.cfdx.options.DataBook.DBTargetInstance of a target source options interface
- x:
pyCart.runmatrix.RunMatrixRun matrix interface
- opts:
cape.cfdx.options.OptionsOptions interface
- RootDir:
strRoot directory, defaults to
os.getcwd()- Outputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target class
- Versions:
2014-12-20
@ddalle: Started2015-01-10
@ddalle: v1.02015-12-14
@ddalle: Added uncertainties
- CheckColumn(ctargs, pt, cf, sfx)¶
Check a data book target column name and its consistency
- Call:
>>> fi = DBT.CheckColumn(ctargs, pt, c)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the data book target class
- ctargs:
dictDictionary of target column names for each coefficient
- pt:
strName of subcomponent (short for ‘point’)
- c:
strName of the coefficient in question, including suffix
- Outputs:
- fi:
None|strName of the column in data book if present
- Versions:
2015-12-14
@ddalle: v1.0
- FindMatch(DBc, i)¶
Find an entry by run matrix (trajectory) variables
Cases will be considered matches by comparing variables specified in the DataBook section of
cape.json as cases to compare against. Suppose that the control file contains the following: "DataBook": { "Targets": { "Experiment": { "File": "WT.dat", "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"}, "Tolerances": { "alpha": 0.05, "Mach": 0.01 } } } } Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled MACH) and alpha to within 0.05 is considered a match. If there are more trajectory variables, they are not used for this filtering of matches.
- Call:
>>> j = DBT.FindMatch(x, i)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target data carrier
- x:
cape.runmatrix.RunMatrixThe current pyCart trajectory (i.e. run matrix)
- i:
intIndex of the case from the trajectory to match
- Outputs:
- j:
numpy.ndarray[int]Array of indices that match the trajectory within tolerances
- See also:
- Versions:
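The "RunMatrix" mapping in the JSON snippet above translates trajectory key names to target-file column labels before tolerances are applied; a hypothetical standalone version of that test:

```python
import numpy as np

def find_match(target_cols, x, i, runmatrix_map, tolerances):
    """Match case *i* of trajectory dict *x* against target data.

    *runmatrix_map* translates trajectory keys (e.g. "Mach") to target
    column labels (e.g. "MACH"); keys without a tolerance are ignored,
    matching the filtering behavior described above.
    """
    n = len(next(iter(target_cols.values())))
    mask = np.ones(n, dtype=bool)
    for key, tol in tolerances.items():
        col = runmatrix_map.get(key, key)
        mask &= np.abs(target_cols[col] - x[key][i]) <= tol
    return np.where(mask)[0]
```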
- GetCoeff(comp, coeff, I, **kw)¶
Get a coefficient value for one or more cases
- Call:
>>> v = DBT.GetCoeff(comp, coeff, i) >>> V = DBT.GetCoeff(comp, coeff, I)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target class
- comp:
strComponent whose coefficient is being plotted
- coeff:
strCoefficient being plotted
- i:
intIndividual case/entry index
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Outputs:
- v:
floatScalar value from the appropriate column
- V:
np.ndarrayArray of values from the appropriate column
- Versions:
2018-02-12
@ddalle: v1.0
- PlotCoeff(comp, coeff, I, **kw)¶
Plot a sweep of one coefficient over several cases
- Call:
>>> h = DBT.PlotCoeff(comp, coeff, I, **kw)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target class
- comp:
strComponent whose coefficient is being plotted
- coeff:
strCoefficient being plotted
- I:
numpy.ndarray[int]List of indexes of cases to include in sweep
- Keyword Arguments:
- x: [ {None} |
str]RunMatrix key for x axis (or plot against index if
None)- Label: [ {comp} |
str]Manually specified label
- Legend: [ {True} | False ]
Whether or not to use a legend
- StDev: [ {None} |
float]Multiple of iterative history standard deviation to plot
- MinMax: [ {False} | True ]
Whether to plot minimum and maximum over iterative history
- Uncertainty: [ {False} | True ]
Whether to plot direct uncertainty
- PlotOptions:
dictPlot options for the primary line(s)
- StDevOptions:
dictDictionary of plot options for the standard deviation plot
- MinMaxOptions:
dictDictionary of plot options for the min/max plot
- UncertaintyOptions:
dictDictionary of plot options for the uncertainty plot
- FigureWidth:
floatWidth of figure in inches
- FigureHeight:
floatHeight of figure in inches
- PlotTypeStDev: [ {‘FillBetween’} | ‘ErrorBar’ ]
Plot function to use for standard deviation plot
- PlotTypeMinMax: [ {‘FillBetween’} | ‘ErrorBar’ ]
Plot function to use for min/max plot
- PlotTypeUncertainty: [ ‘FillBetween’ | {‘ErrorBar’} ]
Plot function to use for uncertainty plot
- Outputs:
- h:
dictDictionary of plot handles
- Versions:
2015-05-30
@ddalle: v1.02015-12-14
@ddalle: Added uncertainties
- ProcessColumns()¶
Process data columns and split into dictionary keys
- Call:
>>> DBT.ProcessColumns()- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the data book target class
- Versions:
2015-06-03
@ddalle: Copied from__init__()method2015-12-14
@ddalle: Added support for point sensors
- ReadAllData(fname, delimiter=', ', skiprows=0)¶
Read target data file all at once
- Call:
>>> DBT.ReadAllData(fname, delimiter=", ", skiprows=0)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target class
- fname:
strName of file to read
- delimiter:
strData delimiter character(s)
- skiprows:
intNumber of header rows to skip
- Versions:
2015-09-07
@ddalle: v1.0
- ReadData()¶
Read data file according to stored options
- Call:
>>> DBT.ReadData()- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the data book target class
- Versions:
2015-06-03
@ddalle: Copied from__init__()method
- ReadDataByColumn(fname, delimiter=', ', skiprows=0)¶
Read target data one column at a time
- Call:
>>> DBT.ReadDataByColumn(fname, delimiter=", ", skiprows=0)- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the Cape data book target class
- fname:
strName of file to read
- delimiter:
strData delimiter character(s)
- skiprows:
intNumber of header rows to skip
- Versions:
2015-09-07
@ddalle: v1.0
- UpdateRunMatrix()¶
Match the trajectory to the cases in the data book
- Call:
>>> DBT.UpdateRunMatrix()- Inputs:
- DBT:
cape.cfdx.dataBook.DBTargetInstance of the data book target class
- Versions:
2015-06-03
@ddalle: v1.0
- class cape.cfdx.dataBook.DBTriqFM(x, opts, comp, **kw)¶
Force and moment component extracted from surface triangulation
- Call:
>>> DBF = DBTriqFM(x, opts, comp, RootDir=None)- Inputs:
- x:
cape.runmatrix.RunMatrixRunMatrix/run matrix interface
- opts:
cape.cfdx.options.OptionsOptions interface
- comp:
strName of TriqFM component
- RootDir: {
None} |strRoot directory for the configuration
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists- Outputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2017-03-28
@ddalle: v1.0
- ApplyTransformations(i, FM)¶
Apply transformations to forces and moments
- Call:
>>> fm = DBF.ApplyTransformations(i, FM)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- i:
intCase index
- fm:
dict(dict[float])Dictionary of force & moment coefficients
- Outputs:
- Versions:
2017-03-29
@ddalle: v1.0
- GetCompID(patch)¶
Get the component ID name(s) or number(s) to use for each patch
- Call:
>>> compID = DBF.GetCompID(patch)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- patch:
strName of patch
- Outputs:
- Versions:
2017-03-28
@ddalle: v1.0
- GetConditions(i)¶
Get the freestream conditions needed for forces
- Call:
>>> xi = DBF.GetConditions(i)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- i:
intCase index
- Outputs:
- xi:
dictDictionary of Mach number (mach), Reynolds number (Re)
- Versions:
2017-03-28
@ddalle: v1.0
- GetDimensionalForces(patch, i, FM)¶
Get dimensional forces
This dimensionalizes any force or moment coefficient already in fm, replacing the first character
'C' with 'F'. For example, "FA" is the dimensional axial force from "CA", and "FAv" is the dimensional axial component of the viscous force.
- Call:
>>> fm = DBF.GetDimensionalForces(patch, i, FM)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- patch:
strName of patch
- i:
intCase index
- fm:
dict[float]Dictionary of force & moment coefficients
- Outputs:
- Versions:
2017-03-29
@ddalle: v1.0
- GetPatchCompIDs()¶
Get the list of component IDs mapped from the template tri
- Call:
>>> CompIDs = DBF.GetPatchCompIDs()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Outputs:
- Versions:
2017-03-30
@ddalle: v1.0
- GetRefComponent()¶
Get the first component
- Call:
>>> DBc = DBF.GetRefComponent()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Outputs:
- DBc:
cape.cfdx.dataBook.DBCompData book for one component
- Versions:
2016-08-18
@ddalle: v1.02017-04-05
@ddalle: Had to customize for TriqFM
- GetStateVars(patch, FM)¶
Get additional state variables, such as minimum Cp
- GetTriqFile()¶
Get most recent
triqfile and its associated iterations
- Call:
>>> qtriq, ftriq, n, i0, i1 = DBF.GetTriqFile()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Outputs:
- Versions:
2016-12-19
@ddalle: Added to the module
- GetTriqForces(i, **kw)¶
Get the forces, moments, and other states on each patch
- Call:
>>> fm = DBF.GetTriqForces(i)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- i:
intCase index
- Outputs:
- Versions:
2017-03-28
@ddalle: v1.0
- GetTriqForcesPatch(patch, i, **kw)¶
Get the forces and moments on a patch
- Call:
>>> fm = DBF.GetTriqForces(patch, i, **kw)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- patch:
strName of patch
- i:
intCase index
- Outputs:
- Versions:
2017-03-28
@ddalle: v1.0
- Lock()¶
Lock the data book component
- Call:
>>> DBF.Lock()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2017-06-12
@ddalle: v1.0
- MapTriCompID()¶
Perform any component ID mapping if necessary
- Call:
>>> DBF.MapTriCompID()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Attributes:
- DBF.compmap:
dictMap of component numbers altered during the mapping
- Versions:
2017-03-28
@ddalle: v1.0
- Merge(DBF1)¶
Merge another copy of the TriqFM data book
- Call:
>>> DBF.Merge(DBF1)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- DBF1:
cape.cfdx.dataBook.DBTriqFMAnother instance of related TriqFM data book
- Versions:
2016-06-26
@ddalle: v1.0
- PreprocessTriq(ftriq, **kw)¶
Perform any necessary preprocessing to create
triqfile
- Call:
>>> ftriq = DBF.PreprocessTriq(ftriq, qpbs=False, f=None)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- ftriq:
strName of triq file
- i: {
None} |intCase index
- Versions:
2016-12-19
@ddalle: v1.02016-12-21
@ddalle: Added PBS
- ReadCopy(check=False, lock=False)¶
Read a copied database object
- Call:
>>> DBF1 = DBF.ReadCopy(check=False, lock=False)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists- Outputs:
- DBF1:
cape.cfdx.dataBook.DBTriqFMAnother instance of related TriqFM data book
- Versions:
2017-06-26
@ddalle: v1.0
- ReadTriMap()¶
Read the triangulation to use for mapping
- Call:
>>> DBF.ReadTriMap()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2017-03-28
@ddalle: v1.0
- ReadTriq(ftriq)¶
Read a
triqannotated surface triangulation
- Call:
>>> DBF.ReadTriq(ftriq)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- ftriq:
strName of
triqfile- Versions:
2017-03-28
@ddalle: v1.0
- SelectTriq()¶
Select the components of triq that are mapped patches
- Call:
>>> triq = DBF.SelectTriq()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Outputs:
- triq:
cape.tri.TriqInterface to annotated surface triangulation
- Versions:
2017-03-30
@ddalle: v1.0
- Sort()¶
Sort point sensor group
- Call:
>>> DBF.Sort()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2016-03-08
@ddalle: v1.0
- TouchLock()¶
Touch a ‘LOCK’ file for a data book component to reset its mod time
- Call:
>>> DBF.TouchLock()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2017-06-14
@ddalle: v1.0
- TransformFM(FM, topts, i)¶
Transform a force and moment history
Available transformations and their parameters are listed below.
“Euler321”: “psi”, “theta”, “phi”
“ScaleCoeffs”: “CA”, “CY”, “CN”, “CLL”, “CLM”, “CLN”
RunMatrix variables are used to specify values to use for the transformation variables. For example,
topts = {"Type": "Euler321", "psi": "Psi", "theta": "Theta", "phi": "Phi"}will cause this function to perform a reverse Euler 3-2-1 transformation using x.Psi[i], x.Theta[i], and x.Phi[i] as the angles.
Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.
topts = {"Type": "ScaleCoeffs", "CLL": -1.0, "CLN": -1.0}
- Call:
>>> fm.TransformFM(topts, x, i)- Inputs:
- fm:
cape.cfdx.dataBook.CaseFMInstance of the force and moment class
- topts:
dictDictionary of options for the transformation
- x:
cape.runmatrix.RunMatrixThe run matrix used for this analysis
- i:
intThe index of the case in the current run matrix
- Versions:
2014-12-22
@ddalle: v1.0
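A self-contained sketch of the "Euler321" branch applied to the (CA, CY, CN) force vector; the specific angle and sign conventions here are assumptions for illustration, not necessarily those of CaseFM:

```python
import numpy as np

def euler321_transform(fm, psi, theta, phi):
    """Rotate body-axis force coefficients (CA, CY, CN) through an
    Euler 3-2-1 (yaw-pitch-roll) sequence. Angles are in radians;
    the sign conventions are an assumption for this sketch."""
    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cph, sph = np.cos(phi), np.sin(phi)
    # Elementary rotations about the 3 (z), 2 (y), and 1 (x) axes
    R3 = np.array([[cps, -sps, 0.0], [sps, cps, 0.0], [0.0, 0.0, 1.0]])
    R2 = np.array([[cth, 0.0, sth], [0.0, 1.0, 0.0], [-sth, 0.0, cth]])
    R1 = np.array([[1.0, 0.0, 0.0], [0.0, cph, -sph], [0.0, sph, cph]])
    v = np.array([fm["CA"], fm["CY"], fm["CN"]])
    CA, CY, CN = (R1 @ R2 @ R3) @ v
    return {"CA": CA, "CY": CY, "CN": CN}
```

Whatever the convention, the rotation is orthogonal, so the total force magnitude is preserved; only its distribution among axial, side, and normal components changes.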
- Triq2Plt(triq, **kw)¶
Convert an annotated tri (TRIQ) interface to Tecplot (PLT)
- Call:
>>> plt = DBF.Triq2Plt(triq, **kw)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- triq:
cape.tri.TriqInterface to annotated surface triangulation
- i: {
None} |intIndex number if needed
- t: {
1.0} |floatTime step or iteration number
- Outputs:
- plt:
cape.plt.PltBinary Tecplot interface
- Versions:
2017-03-30
@ddalle: v1.0
- Unlock()¶
Unlock the data book component (delete lock file)
- Call:
>>> DBF.Unlock()- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2017-06-12
@ddalle: v1.0
- UpdateCase(i)¶
Prepare to update a TriqFM group if necessary
- Call:
>>> n = DBF.UpdateCase(i)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- i:
intCase index
- Outputs:
- n:
0|1How many updates were made
- Versions:
2017-03-28
@ddalle: v1.0
- Write(merge=False, unlock=True)¶
Write to file each point sensor data book in a group
- Call:
>>> DBF.Write(merge=False, unlock=True)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- merge:
True| {False}Whether or not to reread data book and merge before writing
- unlock: {
True} |FalseWhether or not to delete any lock file
- Versions:
2015-12-04
@ddalle: v1.02017-06-26
@ddalle: v1.0
- WriteTriq(i, **kw)¶
Write mapped solution as TRIQ or Tecplot file with zones
- Call:
>>> DBF.WriteTriq(i, **kw)- Inputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- i:
intCase index
- t: {
1} |floatIteration number
- Versions:
2017-03-30
@ddalle: v1.0
- class cape.cfdx.dataBook.DBTriqFMComp(x, opts, comp, patch=None, **kw)¶
Force and moment component extracted from surface triangulation
- Call:
>>> DBF = DBTriqFMComp(x, opts, comp, RootDir=None)- Inputs:
- x:
cape.runmatrix.RunMatrixRunMatrix/run matrix interface
- opts:
cape.cfdx.options.OptionsOptions interface
- comp:
strName of TriqFM component
- RootDir: {
None} |strRoot directory for the configuration
- check:
True| {False}Whether or not to check LOCK status
- lock:
True| {False}If
True, wait if the LOCK file exists- Outputs:
- DBF:
cape.cfdx.dataBook.DBTriqFMInstance of TriqFM data book
- Versions:
2017-03-28
@ddalle: v1.0
Data book classes for individual cases¶
- class cape.cfdx.dataBook.CaseData(**kw)¶
Base class for case iterative histories
- Call:
>>> fm = CaseData()- Outputs:
- fm:
cape.cfdx.dataBook.CaseDataBase iterative history class
- Versions:
2015-12-07
@ddalle: v1.02024-01-10
@ddalle: v2.0
- ExtractValue(c: str, col=None, **kw)¶
Extract the iterative history for one coefficient/state
This function may be customized for some modules
- Call:
>>> C = fm.ExtractValue(c) >>> C = fm.ExtractValue(c, col=None)- Inputs:
- fm:
cape.cfdx.dataBook.CaseDataCase component history class
- c:
strName of state
- col: {
None} |intColumn number
- Outputs:
- C:
np.ndarrayValues for c at each iteration or sample interval
- Versions:
2015-12-07
@ddalle: v1.02024-01-10
@ddalle: v2.0, CaseFM -> DataKit
- GetIterationIndex(i: int)¶
Return index of a particular iteration in fm.i
If the iteration i is not present in the history, the index of the last available iteration less than or equal to i is returned.
- Call:
>>> j = fm.GetIterationIndex(i)- Inputs:
- fm:
cape.cfdx.dataBook.CaseDataCase component history class
- i:
intIteration number
- Outputs:
- j:
intIndex of last iteration less than or equal to i
- Versions:
2015-03-06
@ddalle: v1.0 (CaseFM)2015-12-07
@ddalle: v1.02024-01-11
@ddalle: v1.1; use keys instead of attrs
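The "last iteration less than or equal to i" rule maps directly onto np.searchsorted; a standalone sketch (the function name is illustrative):

```python
import numpy as np

def get_iteration_index(iters, i):
    """Return the index of the last entry in *iters* (sorted ascending)
    that is less than or equal to *i*; -1 if every entry exceeds *i*."""
    j = np.searchsorted(iters, i, side="right") - 1
    return int(j)
```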
- PlotValue(c: str, col=None, n=None, **kw)¶
  Plot an iterative history of some value named c
  - Call:
    >>> h = fm.PlotValue(c, n=None, **kw)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseData
      Case component history class
    - c: str
      Name of coefficient to plot, e.g. 'CA'
    - col: str | int | None
      Select a column by name or index
    - n: int
      Only show the last n iterations
    - nMin: {0} | int
      First iteration allowed for use in averaging
    - nAvg, nStats: {100} | int
      Use at least the last nAvg iterations to compute an average
    - dnAvg, dnStats: {nStats} | int
      Use intervals of dnStats iterations for candidate windows
    - nMax, nMaxStats: {nStats} | int
      Use at most nMax iterations
    - d: float
      Delta in the coefficient to show expected range
    - k: float
      Multiple of iterative standard deviation to plot
    - u: float
      Multiple of sampling error standard deviation to plot
    - err: float
      Fixed sampling error; default uses util.SearchSinusoidFit()
    - nLast: int
      Last iteration to use (defaults to last iteration available)
    - nFirst: int
      First iteration to plot
    - FigureWidth: float
      Figure width
    - FigureHeight: float
      Figure height
    - PlotOptions: dict
      Dictionary of additional options for line plot
    - StDevOptions: dict
      Options passed to plt.fill_between() for stdev plot
    - ErrPltOptions: dict
      Options passed to plt.fill_between() for uncertainty plot
    - DeltaOptions: dict
      Options passed to plt.plot() for reference range plot
    - MeanOptions: dict
      Options passed to plt.plot() for mean line
    - ShowMu: bool
      Option to print value of mean
    - ShowSigma: bool
      Option to print value of standard deviation
    - ShowError: bool
      Option to print value of sampling error
    - ShowDelta: bool
      Option to print reference value
    - MuFormat: {"%.4f"} | str
      Format for text label of the mean value
    - DeltaFormat: {"%.4f"} | str
      Format for text label of the reference value d
    - SigmaFormat: {"%.4f"} | str
      Format for text label of the iterative standard deviation
    - ErrorFormat: {"%.4f"} | str
      Format for text label of the sampling error
    - XLabel: str
      Specified label for x-axis; default is "Iteration Number"
    - YLabel: str
      Specified label for y-axis; default is c
    - Grid: {None} | True | False
      Turn on/off major grid lines, or leave as is if None
    - GridStyle: {{}} | dict
      Dictionary of major grid line style options
    - MinorGrid: {None} | True | False
      Turn on/off minor grid lines, or leave as is if None
    - MinorGridStyle: {{}} | dict
      Dictionary of minor grid line style options
    - Ticks: {None} | False
      Turn off ticks if False
    - XTicks: {Ticks} | None | False | list
      x-axis tick levels; turn off if False or []
    - YTicks: {Ticks} | None | False | list
      y-axis tick levels; turn off if False or []
    - TickLabels: {None} | False
      Turn off tick labels if False
    - XTickLabels: {None} | False | list
      x-axis tick labels; turn off if False or []
    - YTickLabels: {None} | False | list
      y-axis tick labels; turn off if False or []
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2014-11-12 @ddalle: v1.0
    2014-12-09 @ddalle: v1.1; move to AeroPlot class
    2015-02-15 @ddalle: v1.2; move to Aero class
    2015-03-04 @ddalle: v1.3; add nStart and nLast
    2015-12-07 @ddalle: v1.4; move to CaseData
    2017-10-12 @ddalle: v1.5; add grid and tick options
    2024-01-10 @ddalle: v1.6; DataKit updates
- PlotValueHist(coeff: str, nAvg=100, nLast=None, **kw)¶
  Plot a histogram of the iterative history of some value c
  - Call:
    >>> h = fm.PlotValueHist(comp, c, n=1000, nAvg=100, **kw)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseData
      Instance of the component force history class
    - comp: str
      Name of component to plot
    - c: str
      Name of coefficient to plot, e.g. 'CA'
    - nAvg: int
      Use the last nAvg iterations to compute an average
    - nBins: {20} | int
      Number of bins in histogram; can also be set in HistOptions
    - nLast: int
      Last iteration to use (defaults to last iteration available)
  - Keyword Arguments:
    - FigureWidth: float
      Figure width
    - FigureHeight: float
      Figure height
    - Label: {comp} | str
      Manually specified label
    - TargetValue: float | list[float]
      Target or list of target values
    - TargetLabel: str | list[str]
      Legend label(s) for target(s)
    - StDev: {None} | float
      Multiple of iterative history standard deviation to plot
    - HistOptions: dict
      Plot options for the primary histogram
    - StDevOptions: dict
      Dictionary of plot options for the standard deviation plot
    - DeltaOptions: dict
      Options passed to plt.plot() for reference range plot
    - MeanOptions: dict
      Options passed to plt.plot() for mean line
    - TargetOptions: dict
      Options passed to plt.plot() for target value lines
    - OutlierSigma: {7.0} | float
      Standard deviation multiplier for determining outliers
    - ShowMu: bool
      Option to print value of mean
    - ShowSigma: bool
      Option to print value of standard deviation
    - ShowError: bool
      Option to print value of sampling error
    - ShowDelta: bool
      Option to print reference value
    - ShowTarget: bool
      Option to show target value
    - MuFormat: {"%.4f"} | str
      Format for text label of the mean value
    - DeltaFormat: {"%.4f"} | str
      Format for text label of the reference value d
    - SigmaFormat: {"%.4f"} | str
      Format for text label of the iterative standard deviation
    - TargetFormat: {"%.4f"} | str
      Format for text label of the target value
    - XLabel: str
      Specified label for x-axis; default is "Iteration Number"
    - YLabel: str
      Specified label for y-axis; default is c
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2015-02-15 @ddalle: v1.0
    2015-03-06 @ddalle: v1.1; add nLast
    2015-03-06 @ddalle: v1.2; change class
    2024-01-10 @ddalle: v1.3; DataKit updates
- append_casedata(data: dict, jsrc=None, typ='raw')¶
  Append data read from a single file
  - Call:
    >>> h.append_casedata(data, jsrc=None, typ="raw")
  - Inputs:
  - Versions:
    2024-01-22 @ddalle: v1.0
    2024-02-21 @ddalle: v1.1; add typ
- apply_mask(mask=None, parent: str = 'i')¶
  Remove subset of iterative history
  - Call:
    >>> h.apply_mask(mask, parent="i")
  - Inputs:
    - h: CaseData
      Single-case iterative history instance
    - mask: np.ndarray[bool | int]
      Optional mask of which cases to keep
    - parent: {"i"} | str
      Name of iterations column
  - Versions:
    2024-01-22 @ddalle: v1.0
    2024-02-20 @ddalle: v1.1; add parent
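The masking semantics above can be illustrated with plain NumPy indexing: every column aligned with the parent iteration column is reduced to the masked subset. This is a simplified, hypothetical stand-in for the real method; the dict-based apply_mask below is not Cape's implementation.

```python
import numpy as np

# Illustrative sketch of apply_mask() semantics: keep only the masked
# subset of every column that is aligned with the parent column "i".
def apply_mask(history: dict, mask, parent: str = "i") -> dict:
    n = len(history[parent])
    out = {}
    for col, vals in history.items():
        vals = np.asarray(vals)
        # Only columns with the same length as the parent column shrink
        out[col] = vals[mask] if len(vals) == n else vals
    return out

# History with a duplicated iteration 2; drop it with a boolean mask
hist = {"i": np.array([1, 2, 2, 3]), "CA": np.array([0.5, 0.6, 0.6, 0.7])}
trimmed = apply_mask(hist, np.array([True, True, False, True]))
```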
- init_empty()¶
  Create empty CaseFM instance
  - Call:
    >>> h.init_empty()
  - Inputs:
    - h: CaseData
      Single-case iterative history instance
  - Versions:
    2015-10-16 @ddalle: v1.0
    2023-01-11 @ddalle: v2.0; DataKit updates
    2024-01-22 @ddalle: v2.1; use class attributes
- init_sourcefiles()¶
  Initialize file name list and metadata
  - Call:
    >>> h.init_sourcefiles()
  - Inputs:
    - h: CaseData
      Single-case iterative history instance
  - Versions:
    2024-01-22 @ddalle: v1.0
    2024-02-21 @ddalle: v1.1; add subiteration hooks
- process_sourcefile(fname: str)¶
  Read data from a file (if necessary)
  In most cases, developers will NOT need to customize this function for each application or for each solver.
- process_subiter_sourcefile(fname: str)¶
  Read data from a subiteration history file (if necessary)
  In most cases, developers will NOT need to customize this function for each application or for each solver.
- read()¶
  Read iterative history from all sources
  This will first attempt to read the cached history from a .cdb file and then read any raw solver output files as necessary.
  - Call:
    >>> h.read()
  - Inputs:
    - h: CaseData
      Single-case iterative history instance
  - Versions:
    2024-01-22 @ddalle: v1.0
    2024-02-21 @ddalle: v1.1; add subiteration hooks
- read_cdb()¶
  Read contents of history from .cdb file
  See the capefile module. The name of the file will be f"cape/fm_{fm.comp}.cdb".
  - Call:
    >>> fm.read_cdb()
  - Inputs:
    - fm: CaseData
      Iterative history instance
  - Versions:
    2024-01-20 @ddalle: v1.0
    2024-01-22 @ddalle: v1.1; _special_cols check
- readfile(fname: str) -> dict¶
  Read raw data solver file and return a dict
  This method needs to be customized for each individual solver.
- readfile_lastiter(fname: str) -> float¶
  Estimate the last iteration of a data file
  The purpose of this function is to determine whether the file fname needs to be read. If the result is negative, the file is always read.
  This function should be customized for each subclass. If it is not, the latest raw data file written by the solver is always read.
- readfile_subiter(fname: str) -> dict¶
  Read raw data subiteration solver file and return a dict
  This method needs to be customized for each individual solver.
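A subclass override of readfile_lastiter might simply peek at the final data line of a whitespace-delimited history file. The sketch below is a hedged example under an assumed file format (iteration number in the first column, `#` comments); it is not taken from any actual Cape solver module.

```python
import os
import tempfile

# Hypothetical solver-specific readfile_lastiter(): return the
# iteration number in the first column of the file's last data line,
# or a negative value to force a full read.
def readfile_lastiter(fname: str) -> float:
    last = None
    with open(fname) as fp:
        for line in fp:
            line = line.strip()
            # Skip blank lines and comment headers
            if line and not line.startswith("#"):
                last = line
    if last is None:
        return -1.0  # empty/unreadable: always read
    return float(last.split()[0])

# Demonstrate on a small synthetic history file
with tempfile.NamedTemporaryFile("w", suffix=".dat", delete=False) as fp:
    fp.write("# iter CA\n100 0.51\n200 0.50\n")
    fname = fp.name
last_iter = readfile_lastiter(fname)
os.unlink(fname)
```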
- class cape.cfdx.dataBook.CaseFM(comp: str, **kw)¶
  Force and moment iterative histories
  This class contains methods for reading data about the history of an individual component for a single case. The list of available components comes from a loadsCC.dat file if one exists.
  - Call:
    >>> fm = cape.cfdx.dataBook.CaseFM(C, MRP=None, A=None)
  - Inputs:
  - Outputs:
  - Versions:
    2014-11-12 @ddalle: Starter version
    2014-12-21 @ddalle: Copied from previous aero.FM
- AddData(A: dict)¶
  Add iterative force and/or moment history for a component
  - Call:
    >>> fm.AddData(A)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the force and moment class
    - A: numpy.ndarray, shape=(N,4) or shape=(N,7)
      Matrix of forces and/or moments at N iterations
  - Versions:
    2014-11-12 @ddalle: v1.0
    2015-10-16 @ddalle: v2.0; complete rewrite
    2024-01-10 @ddalle: v2.1; simplify using DataKit
- Copy()¶
  Copy an iterative force & moment history
  - Call:
    >>> FM2 = FM1.Copy()
  - Inputs:
    - FM1: cape.cfdx.dataBook.CaseFM
      Force and moment history
  - Outputs:
    - FM2: cape.cfdx.dataBook.CaseFM
      Copy of FM1
  - Versions:
    2017-03-20 @ddalle: v1.0
    2024-01-10 @ddalle: v2.0; simplify using DataKit
- GetStats(nStats=100, nMax=None, **kw)¶
  Get mean, min, max, and stdev for all coefficients
  - Call:
    >>> s = fm.GetStats(nStats, nMax=None, nLast=None)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the force and moment class
    - coeff: str
      Name of coefficient to process
    - nStats: {100} | int
      Minimum number of iterations in window to use for statistics
    - dnStats: {nStats} | int
      Interval size for candidate windows
    - nMax: {nStats} | int
      Maximum number of iterations to use for statistics
    - nMin: {0} | int
      First usable iteration number
    - nLast: {fm.i[-1]} | int
      Last iteration to use for statistics
  - Outputs:
  - Versions:
    2017-09-29 @ddalle: v1.0
    2024-01-10 @ddalle: v1.1; DataKit updates
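The core of windowed statistics like these is averaging over a trailing slice of the iterative history. The sketch below shows only that core idea under stated simplifications (no candidate-window search, no nMin/nLast handling); get_stats is an illustrative name, not Cape's method.

```python
import numpy as np

# Simplified sketch of trailing-window statistics in the spirit of
# GetStats(): mean/min/max/std over the last nStats history entries.
def get_stats(vals, nStats: int = 100) -> dict:
    window = np.asarray(vals, dtype=float)[-nStats:]
    return {
        "mu": float(window.mean()),
        "min": float(window.min()),
        "max": float(window.max()),
        "std": float(window.std()),
        "nStats": len(window),
    }

# Synthetic converged history settling at 0.5
vals = [0.7, 0.6, 0.5, 0.5, 0.5, 0.5]
s = get_stats(vals, nStats=4)
```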
- GetStatsCoeff(coeff, nStats=100, nMax=None, **kw)¶
  Get mean, min, max, and other statistics for one coefficient
  - Call:
    >>> s = fm.GetStatsCoeff(coeff, nStats=100, nMax=None, **kw)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the force and moment class
    - coeff: str
      Name of coefficient to process
    - nStats: {100} | int
      Minimum number of iterations in window to use for statistics
    - dnStats: {nStats} | int
      Interval size for candidate windows
    - nMax: {nStats} | int
      Maximum number of iterations to use for statistics
    - nMin: {0} | int
      First usable iteration number
    - nLast: {fm.i[-1]} | int
      Last iteration to use for statistics
  - Outputs:
  - Versions:
    2017-09-29 @ddalle: v1.0
    2024-01-10 @ddalle: v1.1; DataKit updates
- GetStatsN(nStats=100, nLast=None)¶
  Get mean, min, max, and standard deviation for all coefficients
  - Call:
    >>> s = fm.GetStatsN(nStats, nLast=None)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the force and moment class
    - nStats: int
      Number of iterations in window to use for statistics
    - nLast: int
      Last iteration to use for statistics
  - Outputs:
  - Versions:
    2014-12-09 @ddalle: v1.0
    2015-02-28 @ddalle: v1.1; was GetStats()
    2015-03-04 @ddalle: v1.2; add nLast
    2024-01-10 @ddalle: v1.3; DataKit updates
- GetStatsOld(nStats=100, nMax=None, nLast=None)¶
  Get mean, min, max, and standard deviation for all coefficients
  - Call:
    >>> s = fm.GetStatsOld(nStats, nMax=None, nLast=None)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the force and moment class
    - nStats: int
      Minimum number of iterations in window to use for statistics
    - nMax: int
      Maximum number of iterations to use for statistics
    - nLast: int
      Last iteration to use for statistics
  - Outputs:
  - Versions:
    2015-02-28 @ddalle: v1.0
    2015-03-04 @ddalle: v1.1; add nLast
    2024-01-10 @ddalle: v1.2; DataKit updates
- PlotCoeff(c: str, n=None, **kw)¶
  Plot a single coefficient history
  - Call:
    >>> h = fm.PlotCoeff(c, n=1000, nAvg=100, **kw)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the component force history class
    - c: str
      Name of coefficient to plot, e.g. 'CA'
    - n: int
      Only show the last n iterations
    - nAvg: int
      Use the last nAvg iterations to compute an average
    - d: float
      Delta in the coefficient to show expected range
    - nLast: int
      Last iteration to use (defaults to last iteration available)
    - nFirst: int
      First iteration to plot
    - FigureWidth: float
      Figure width
    - FigureHeight: float
      Figure height
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2014-11-12 @ddalle: v1.0
    2014-12-09 @ddalle: Transferred to AeroPlot
    2015-02-15 @ddalle: Transferred to dataBook.Aero
    2015-03-04 @ddalle: Added nStart and nLast
    2015-12-07 @ddalle: Moved content to base class
- PlotCoeffHist(c: str, nAvg=100, nBin=20, nLast=None, **kw)¶
  Plot a single coefficient histogram
  - Call:
    >>> h = fm.PlotCoeffHist(comp, c, n=1000, nAvg=100, **kw)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the component force history class
    - comp: str
      Name of component to plot
    - c: str
      Name of coefficient to plot, e.g. 'CA'
    - nAvg: int
      Use the last nAvg iterations to compute an average
    - nBin: int
      Number of bins to plot
    - nLast: int
      Last iteration to use (defaults to last iteration available)
    - FigureWidth: float
      Figure width
    - FigureHeight: float
      Figure height
  - Keyword arguments:
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2015-02-15 @ddalle: v1.0
    2015-03-06 @ddalle: Added nLast and fixed documentation
    2015-03-06 @ddalle: Copied to CaseFM
- ShiftMRP(Lref, x, xi=None)¶
  Shift the moment reference point
- TransformFM(topts, x, i)¶
  Transform a force and moment history
  Available transformations and their parameters are listed below.
    "Euler321": "psi", "theta", "phi"
    "Euler123": "phi", "theta", "psi"
    "ScaleCoeffs": "CA", "CY", "CN", "CLL", "CLM", "CLN"
  RunMatrix variables are used to specify values to use for the transformation variables. For example,
    topts = {"Type": "Euler321", "psi": "Psi", "theta": "Theta", "phi": "Phi"}
  will cause this function to perform a reverse Euler 3-2-1 transformation using x.Psi[i], x.Theta[i], and x.Phi[i] as the angles.
  Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.
    topts = {"Type": "ScaleCoeffs", "CLL": -1.0, "CLN": -1.0}
  - Call:
    >>> fm.TransformFM(topts, x, i)
  - Inputs:
    - fm: cape.cfdx.dataBook.CaseFM
      Instance of the force and moment class
    - topts: dict
      Dictionary of options for the transformation
    - x: cape.runmatrix.RunMatrix
      The run matrix used for this analysis
    - i: int
      Run matrix case index
  - Versions:
    2014-12-22 @ddalle: v1.0
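For reference, an Euler 3-2-1 (yaw-pitch-roll) rotation matrix can be built from the three angles named above. This is a conceptual sketch only: TransformFM applies the reverse transformation, and Cape's exact sign conventions may differ from this illustration.

```python
import numpy as np

# Sketch of an Euler 3-2-1 rotation matrix from angles (psi, theta,
# phi) in radians; illustrative of the math behind "Euler321", not
# Cape's implementation.
def euler321(psi: float, theta: float, phi: float) -> np.ndarray:
    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cph, sph = np.cos(phi), np.sin(phi)
    R3 = np.array([[cps, -sps, 0], [sps, cps, 0], [0, 0, 1]])   # yaw
    R2 = np.array([[cth, 0, sth], [0, 1, 0], [-sth, 0, cth]])   # pitch
    R1 = np.array([[1, 0, 0], [0, cph, -sph], [0, sph, cph]])   # roll
    return R1 @ R2 @ R3

R = euler321(0.1, 0.2, 0.3)
# Any proper rotation matrix is orthonormal: R @ R.T == identity
orth_err = float(np.abs(R @ R.T - np.eye(3)).max())
```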
- TrimIters()¶
  Trim non-ascending iterations and other problems
  - Call:
    >>> fm.TrimIters()
  - Versions:
    2017-10-02 @ddalle: v1.0
    2024-01-10 @ddalle: v2.0; DataKit updates
- class cape.cfdx.dataBook.CaseResid(**kw)¶
  Iterative residual history class
  This class provides an interface to residuals, CPU time, and similar data for a given run directory.
  - Call:
    >>> hist = cape.cfdx.dataBook.CaseResid()
  - Outputs:
    - hist: cape.cfdx.dataBook.CaseResid
      Instance of the run history class
- GetIterationIndex(i)¶
  Return index of a particular iteration in hist.i
  If the iteration i is not present in the history, the index of the last available iteration less than or equal to i is returned.
  - Call:
    >>> j = hist.GetIterationIndex(i)
  - Inputs:
    - hist: cape.cfdx.dataBook.CaseResid
      Instance of the residual history class
    - i: int
      Iteration number
  - Outputs:
    - j: int
      Index of last iteration in hist.i less than or equal to i
  - Versions:
    2015-03-06 @ddalle: v1.0
    2024-01-10 @ddalle: v1.1; DataKit updates
- GetNOrders(nStats=1, col=None)¶
  Get the number of orders of magnitude of residual drop
  - Call:
    >>> nOrders = hist.GetNOrders(nStats=None, col=None)
  - Inputs:
  - Outputs:
    - nOrders: {1} | int
      Number of orders of magnitude of residual drop
  - Versions:
    2015-01-01 @ddalle: v1.0
    2024-01-24 @ddalle: v2.0; generalize w/ DataKit approach
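The quantity being reported is the base-10 logarithmic drop from the initial residual to the (averaged) final residual. The sketch below shows that calculation under stated assumptions; the function name n_orders and the exact averaging are illustrative, not Cape's code.

```python
import numpy as np

# Illustrative residual-drop calculation behind GetNOrders():
# log10(initial residual) minus the mean log10 of the last nStats
# residuals.
def n_orders(L2, nStats: int = 1) -> float:
    L2 = np.asarray(L2, dtype=float)
    return float(np.log10(L2[0]) - np.log10(L2[-nStats:]).mean())

# Residual falling from 1.0 to 1e-6 over the run: a 6-order drop
resid = np.array([1.0, 1e-2, 1e-4, 1e-6])
drop = n_orders(resid)
```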
- GetNOrdersUnsteady(n=1)¶
  Get residual drop magnitude
  - Call:
    >>> nOrders = hist.GetNOrdersUnsteady(n=1)
  - Inputs:
    - hist: cape.cfdx.dataBook.CaseResid
      Instance of the DataBook residual history
    - n: int
      Number of iterations to analyze
  - Outputs:
    - nOrders: numpy.ndarray[float]
      Number of orders of magnitude of unsteady residual drop
  - Versions:
    2015-01-01 @ddalle: First version
- PlotL1(n=None, nFirst=None, nLast=None, **kw)¶
  Plot the L1 residual
  - Call:
    >>> h = hist.PlotL1(n=None, nFirst=None, nLast=None, **kw)
  - Inputs:
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2014-11-12 @ddalle: v1.0
    2014-12-09 @ddalle: v1.1; move to AeroPlot
    2015-02-15 @ddalle: v1.2; move to dataBook.Aero
    2015-03-04 @ddalle: v1.3; add nStart and nLast
    2015-10-21 @ddalle: v1.4; refer to PlotResid()
- PlotL2(n=None, nFirst=None, nLast=None, **kw)¶
  Plot the L2 residual
  - Call:
    >>> h = hist.PlotL2(n=None, nFirst=None, nLast=None, **kw)
  - Inputs:
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2014-11-12 @ddalle: v1.0
    2014-12-09 @ddalle: v1.1; move to AeroPlot
    2015-02-15 @ddalle: v1.2; move to dataBook.Aero
    2015-03-04 @ddalle: v1.3; add nStart and nLast
    2015-10-21 @ddalle: v1.4; refer to PlotResid()
- PlotLInf(n=None, nFirst=None, nLast=None, **kw)¶
  Plot the L-infinity residual
  - Call:
    >>> h = hist.PlotLInf(n=None, nFirst=None, nLast=None, **kw)
  - Inputs:
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2016-02-04 @ddalle: v1.0
- PlotResid(c='L1Resid', n=None, nFirst=None, nLast=None, **kw)¶
  Plot a residual by name
  - Call:
    >>> h = hist.PlotResid(c='L1Resid', n=None, **kw)
  - Inputs:
    - hist: cape.cfdx.dataBook.CaseResid
      Instance of the DataBook residual history
    - c: str
      Name of coefficient to plot
    - n: int
      Only show the last n iterations
    - PlotOptions: dict
      Plot options for the primary line(s)
    - nFirst: int
      Plot starting at iteration nFirst
    - nLast: int
      Plot up to iteration nLast
    - FigureWidth: float
      Figure width
    - FigureHeight: float
      Figure height
    - YLabel: str
      Label for y-axis
  - Outputs:
    - h: dict
      Dictionary of figure/plot handles
  - Versions:
    2014-11-12 @ddalle: v1.0
    2014-12-09 @ddalle: v1.1; move to AeroPlot
    2015-02-15 @ddalle: v1.2; move to dataBook.Aero
    2015-03-04 @ddalle: v1.3; add nStart and nLast
    2015-10-21 @ddalle: v1.4; from PlotL1()
    2022-01-28 @ddalle: v1.5; add xcol
Other cape.cfdx.dataBook methods¶
- cape.cfdx.dataBook.ImportPyPlot()¶
  Import matplotlib.pyplot if not already loaded
  - Call:
    >>> ImportPyPlot()
  - Versions:
    2014-12-27 @ddalle: v1.0
- cape.cfdx.dataBook.get_xlim(ha, pad=0.05)¶
  Calculate appropriate x-limits to include all lines in a plot
  Plotted objects of class matplotlib.lines.Line2D are checked.
- cape.cfdx.dataBook.get_ylim(ha, pad=0.05)¶
  Calculate appropriate y-limits to include all lines in a plot
  Plotted objects of classes matplotlib.lines.Line2D and matplotlib.collections.PolyCollection are checked.
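The padding rule such limit helpers apply is simple: take the extreme data values and widen the interval by a fraction of its span. The sketch below shows only that rule; padded_limits is an illustrative name, not Cape's API, and the real functions first gather extrema from the plotted matplotlib objects.

```python
# Hypothetical sketch of the pad logic behind get_xlim()/get_ylim():
# widen [xmin, xmax] by `pad` times the data span on each side.
def padded_limits(xmin: float, xmax: float, pad: float = 0.05):
    span = xmax - xmin
    return xmin - pad * span, xmax + pad * span

# Data spanning [0, 10] with the default 5% pad on each side
lo, hi = padded_limits(0.0, 10.0, pad=0.05)
```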