Interface OptimizerOptions

    • Method Detail

      • getQueryDistributions

        List<QueryDistribution> getQueryDistributions()
Determines which additional queries, if any, should be calculated by the inference engine when evaluating the fitness of a solution. For example, additional queries may be referenced by function nodes.
      • getQueryFunctions

        List<QueryFunctionOutput> getQueryFunctions()
        Determines which additional functions, if any, should be calculated by the inference engine when evaluating the fitness of a solution. For example, functions that are referenced by a function node optimization target/objective.
      • getCausalEffectKind

        CausalEffectKind getCausalEffectKind()
Gets the kind of causal effect to optimize, e.g. Total or Direct. Not all algorithms support direct effect calculations.
      • setCausalEffectKind

        void setCausalEffectKind​(CausalEffectKind value)
Sets the kind of causal effect to optimize, e.g. Total or Direct. Not all algorithms support direct effect calculations.
      • getQueryLogLikelihood

        Boolean getQueryLogLikelihood()
Determines whether the log-likelihood should be calculated by the inference engine when evaluating the fitness of a solution. For example, the log-likelihood may be referenced by function nodes.
      • setQueryLogLikelihood

        void setQueryLogLikelihood​(Boolean value)
Determines whether the log-likelihood should be calculated by the inference engine when evaluating the fitness of a solution. For example, the log-likelihood may be referenced by function nodes.
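The accessors above suggest a simple configure-then-optimize pattern. The sketch below is illustrative only: `SimpleOptimizerOptions`, its fields, and the enum constants are hypothetical stand-ins for the real interface, not the library's actual implementation.

```java
// Hypothetical, minimal stand-ins for the real interface; names are illustrative.
enum CausalEffectKind { TOTAL, DIRECT }

class SimpleOptimizerOptions {
    private CausalEffectKind causalEffectKind = CausalEffectKind.TOTAL;
    private Boolean queryLogLikelihood = Boolean.FALSE;

    public CausalEffectKind getCausalEffectKind() { return causalEffectKind; }
    public void setCausalEffectKind(CausalEffectKind value) { causalEffectKind = value; }

    public Boolean getQueryLogLikelihood() { return queryLogLikelihood; }
    public void setQueryLogLikelihood(Boolean value) { queryLogLikelihood = value; }
}

public class Main {
    public static void main(String[] args) {
        SimpleOptimizerOptions options = new SimpleOptimizerOptions();
        // Optimize the total causal effect, and ask the inference engine to
        // also compute the log-likelihood during fitness evaluation (for
        // example, because a function node references it).
        options.setCausalEffectKind(CausalEffectKind.TOTAL);
        options.setQueryLogLikelihood(Boolean.TRUE);
        System.out.println(options.getCausalEffectKind() + " " + options.getQueryLogLikelihood());
    }
}
```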
      • getInferenceFactory

        InferenceFactory getInferenceFactory()
        Creates one or more inference engines used by the optimization algorithm.
      • setInferenceFactory

        void setInferenceFactory​(InferenceFactory value)
        Creates one or more inference engines used by the optimization algorithm.
      • getMaximumConcurrency

        Integer getMaximumConcurrency()
Gets the maximum number of inference engines used during optimization. Multiple inference engines may be used in parallel; however, each inference engine has its own memory requirements, so this parameter allows the number to be limited to avoid excessive memory consumption. The amount of memory used per inference engine depends on the Network and also the evidence.
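The effect of bounding concurrency can be sketched with a fixed-size thread pool: no matter how many candidate solutions await evaluation, at most `maxConcurrency` tasks (standing in for inference engines) are ever live at once, which bounds peak memory. This is an illustrative sketch of the idea, not the library's internal implementation.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class Main {
    // Submits `candidates` simulated fitness evaluations to a pool capped at
    // `maxConcurrency` threads, and returns the peak number of tasks that
    // were ever running simultaneously.
    public static int peakConcurrency(int maxConcurrency, int candidates) {
        ExecutorService pool = Executors.newFixedThreadPool(maxConcurrency);
        AtomicInteger running = new AtomicInteger();
        AtomicInteger peak = new AtomicInteger();
        for (int i = 0; i < candidates; i++) {
            pool.submit(() -> {
                int now = running.incrementAndGet();
                peak.accumulateAndGet(now, Math::max);
                try {
                    Thread.sleep(10); // stand-in for an inference query
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                running.decrementAndGet();
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return peak.get();
    }

    public static void main(String[] args) {
        // 16 candidate evaluations, but never more than 4 concurrent "engines".
        System.out.println(peakConcurrency(4, 16));
    }
}
```

Capping the pool rather than spawning one engine per candidate trades some wall-clock time for a predictable memory ceiling.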
      • getStopping

        Stop getStopping()
Gets the instance implementing Stop used for early stopping. Stopping differs from cancellation in that stopping still returns an objective value from the optimization process, albeit after fewer iterations.
        Returns:
        The instance used for stopping.
      • setStopping

        void setStopping​(Stop value)
Sets the instance implementing Stop used for early stopping. Stopping differs from cancellation in that stopping still returns an objective value from the optimization process, albeit after fewer iterations.
        Parameters:
        value - The instance used for stopping.
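The early-stopping behavior described above can be sketched as a check consulted once per iteration. The `StopCheck` interface, `PatienceStop`, and the toy loop below are hypothetical illustrations of the pattern, and the real Stop interface may have a different signature.

```java
// Hypothetical stand-in for the Stop interface: consulted each iteration
// with the current objective value.
interface StopCheck {
    boolean shouldStop(int iteration, double objective);
}

// Stops after `patience` consecutive iterations without improvement.
class PatienceStop implements StopCheck {
    private final int patience;
    private double best = Double.NEGATIVE_INFINITY;
    private int sinceImprovement = 0;

    PatienceStop(int patience) { this.patience = patience; }

    @Override
    public boolean shouldStop(int iteration, double objective) {
        if (objective > best) {
            best = objective;
            sinceImprovement = 0;
        } else {
            sinceImprovement++;
        }
        return sinceImprovement >= patience;
    }
}

public class Main {
    // Toy optimization loop: returns the iteration at which stopping kicked
    // in. Unlike cancellation, the caller still has the best objective so far.
    public static int runUntilStopped(StopCheck stop, double[] objectives) {
        for (int i = 0; i < objectives.length; i++) {
            if (stop.shouldStop(i, objectives[i])) return i;
        }
        return objectives.length;
    }

    public static void main(String[] args) {
        double[] objectives = {1.0, 2.0, 3.0, 3.0, 3.0, 3.0, 3.0};
        int stoppedAt = runUntilStopped(new PatienceStop(3), objectives);
        System.out.println(stoppedAt); // stops once 3 flat iterations are seen
    }
}
```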