Transform ML models into native code (Java, C, Python, Go, JavaScript, Visual Basic, C#, F#, R, PowerShell, PHP, Dart, Haskell, Ruby) with zero dependencies

Overview

m2cgen


m2cgen (Model 2 Code Generator) is a lightweight library which provides an easy way to transpile trained statistical models into native code (Python, C, Java, Go, JavaScript, Visual Basic, C#, PowerShell, R, PHP, Dart, Haskell, Ruby, F#).

Installation

The supported Python version is >= 3.6.

pip install m2cgen

Supported Languages

  • C
  • C#
  • Dart
  • F#
  • Go
  • Haskell
  • Java
  • JavaScript
  • PHP
  • PowerShell
  • Python
  • R
  • Ruby
  • Visual Basic (VBA-compatible)

Supported Models

Linear (Classification)

  • scikit-learn
    • LogisticRegression
    • LogisticRegressionCV
    • PassiveAggressiveClassifier
    • Perceptron
    • RidgeClassifier
    • RidgeClassifierCV
    • SGDClassifier
  • lightning
    • AdaGradClassifier
    • CDClassifier
    • FistaClassifier
    • SAGAClassifier
    • SAGClassifier
    • SDCAClassifier
    • SGDClassifier

Linear (Regression)

  • scikit-learn
    • ARDRegression
    • BayesianRidge
    • ElasticNet
    • ElasticNetCV
    • GammaRegressor
    • HuberRegressor
    • Lars
    • LarsCV
    • Lasso
    • LassoCV
    • LassoLars
    • LassoLarsCV
    • LassoLarsIC
    • LinearRegression
    • OrthogonalMatchingPursuit
    • OrthogonalMatchingPursuitCV
    • PassiveAggressiveRegressor
    • PoissonRegressor
    • RANSACRegressor (only supported regression estimators can be used as a base estimator)
    • Ridge
    • RidgeCV
    • SGDRegressor
    • TheilSenRegressor
    • TweedieRegressor
  • StatsModels
    • Generalized Least Squares (GLS)
    • Generalized Least Squares with AR Errors (GLSAR)
    • Generalized Linear Models (GLM)
    • Ordinary Least Squares (OLS)
    • Gaussian Process Regression Using Maximum Likelihood-based Estimation (ProcessMLE)
    • Quantile Regression (QuantReg)
    • Weighted Least Squares (WLS)
  • lightning
    • AdaGradRegressor
    • CDRegressor
    • FistaRegressor
    • SAGARegressor
    • SAGRegressor
    • SDCARegressor

SVM (Classification)

  • scikit-learn
    • LinearSVC
    • NuSVC
    • SVC
  • lightning
    • KernelSVC
    • LinearSVC

SVM (Regression)

  • scikit-learn
    • LinearSVR
    • NuSVR
    • SVR
  • lightning
    • LinearSVR

Tree (Classification)

  • DecisionTreeClassifier
  • ExtraTreeClassifier

Tree (Regression)

  • DecisionTreeRegressor
  • ExtraTreeRegressor

Random Forest (Classification)

  • ExtraTreesClassifier
  • LGBMClassifier (rf booster only)
  • RandomForestClassifier
  • XGBRFClassifier

Random Forest (Regression)

  • ExtraTreesRegressor
  • LGBMRegressor (rf booster only)
  • RandomForestRegressor
  • XGBRFRegressor

Boosting (Classification)

  • LGBMClassifier (gbdt/dart/goss booster only)
  • XGBClassifier (gbtree, including boosted forests, or gblinear booster only)

Boosting (Regression)

  • LGBMRegressor (gbdt/dart/goss booster only)
  • XGBRegressor (gbtree, including boosted forests, or gblinear booster only)

You can find the versions of packages with which compatibility is guaranteed by CI tests here. Other versions may also work, but they are untested.

    Classification Output

Linear / Linear SVM / Kernel SVM

  • Binary: Scalar value; signed distance of the sample to the hyperplane for the second class.
  • Multiclass: Vector value; signed distance of the sample to the hyperplane for each class.
  • Comment: The output is consistent with the output of LinearClassifierMixin.decision_function.

SVM

  • Binary: Scalar value; signed distance of the sample to the hyperplane for the second class.
  • Multiclass: Vector value; one-vs-one score for each class, shape (n_samples, n_classes * (n_classes-1) / 2).
  • Comment: The output is consistent with the output of BaseSVC.decision_function when decision_function_shape is set to ovo.

Tree / Random Forest / Boosting

  • Binary: Vector value; class probabilities.
  • Multiclass: Vector value; class probabilities.
  • Comment: The output is consistent with the output of the predict_proba method of DecisionTreeClassifier / ExtraTreeClassifier / ExtraTreesClassifier / RandomForestClassifier / XGBRFClassifier / XGBClassifier / LGBMClassifier.
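
As an illustration of how this output can be consumed, here is a minimal sketch (not part of the library) that turns the vector returned by a classifier exported to Python into a class label; it assumes the generated code was saved as model.py and exposes the default score function:

import numpy as np

# Hypothetical module produced by m2c.export_to_python(...) and saved as model.py;
# its score(input) function returns a list of per-class probabilities (or scores).
from model import score

probabilities = score([5.1, 3.5, 1.4, 0.2])      # feature vector as a plain list of floats
predicted_class = int(np.argmax(probabilities))  # index of the most likely class
print(predicted_class)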

    Usage

Here's a simple example of how a linear model trained in a Python environment can be represented in Java code:

    from sklearn.datasets import load_boston
    from sklearn import linear_model
    import m2cgen as m2c
    
    boston = load_boston()
    X, y = boston.data, boston.target
    
    estimator = linear_model.LinearRegression()
    estimator.fit(X, y)
    
    code = m2c.export_to_java(estimator)

    Generated Java code:

    public class Model {
    
        public static double score(double[] input) {
            return (((((((((((((36.45948838508965) + ((input[0]) * (-0.10801135783679647))) + ((input[1]) * (0.04642045836688297))) + ((input[2]) * (0.020558626367073608))) + ((input[3]) * (2.6867338193449406))) + ((input[4]) * (-17.76661122830004))) + ((input[5]) * (3.8098652068092163))) + ((input[6]) * (0.0006922246403454562))) + ((input[7]) * (-1.475566845600257))) + ((input[8]) * (0.30604947898516943))) + ((input[9]) * (-0.012334593916574394))) + ((input[10]) * (-0.9527472317072884))) + ((input[11]) * (0.009311683273794044))) + ((input[12]) * (-0.5247583778554867));
        }
    }

    You can find more examples of generated code for different models/languages here.
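
The same estimator can be exported to any other supported language via the corresponding export function. A brief sketch, assuming the usual export_to_<language> naming (export_to_python and export_to_c also appear in the issues quoted below):

# Export the estimator from the example above to a few more target languages.
python_code = m2c.export_to_python(estimator)
c_code = m2c.export_to_c(estimator)
js_code = m2c.export_to_javascript(estimator)

# Write the generated source to a file so it can be compiled or imported elsewhere.
with open("model.py", "w") as f:
    f.write(python_code)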

    CLI

    m2cgen can be used as a CLI tool to generate code using serialized model objects (pickle protocol):

    $ m2cgen <pickle_file> --language <language> [--indent <indent>] [--function_name <function_name>]
             [--class_name <class_name>] [--module_name <module_name>] [--package_name <package_name>]
             [--namespace <namespace>] [--recursion-limit <recursion_limit>]
    

Don't forget that to unpickle serialized model objects, their classes must be defined in the top level of an importable module in the unpickling environment.

    Piping is also supported:

    $ cat <pickle_file> | m2cgen --language <language>
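
For example, a scikit-learn model can be pickled in Python and then handed to the CLI; a minimal sketch (the file name model.pickle and the diabetes dataset are only illustrative):

import pickle

from sklearn import linear_model
from sklearn.datasets import load_diabetes

# Train any supported estimator and serialize it with the pickle protocol.
X, y = load_diabetes(return_X_y=True)
estimator = linear_model.LinearRegression().fit(X, y)

with open("model.pickle", "wb") as f:
    pickle.dump(estimator, f)

# Then, from the shell:
#   m2cgen model.pickle --language python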
    

    FAQ

Q: Generation fails with a RuntimeError: maximum recursion depth exceeded error.

A: If this error occurs while generating code for an ensemble model, try to reduce the number of trained estimators within that model. Alternatively, you can increase the maximum recursion depth with sys.setrecursionlimit(<new_depth>), as shown in the sketch below.
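
A minimal sketch of the second workaround; the limit value and the model below are only illustrative:

import sys

from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

import m2cgen as m2c

X, y = load_diabetes(return_X_y=True)
estimator = RandomForestRegressor(n_estimators=500).fit(X, y)

# Raise the recursion limit before exporting a large ensemble; the exact
# value is arbitrary and depends on the size of the model.
sys.setrecursionlimit(10000)
code = m2c.export_to_java(estimator)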

Q: Generation fails with an ImportError: No module named <module_name_here> error while transpiling a model from a serialized model object.

A: This error indicates that the pickle protocol cannot deserialize the model object. To unpickle serialized model objects, their classes must be defined in the top level of an importable module in the unpickling environment. Installing the package that provides the model's class definition should solve the problem.

Q: The code generated by m2cgen produces different results for some inputs compared to the original Python model from which the code was obtained.

A: Some models force input data to a particular type during the prediction phase in their native Python libraries. Currently, m2cgen works only with the float64 (double) data type. You can try to cast your input data to float64 manually and check the results again (see the sketch below). Also, small differences can occur due to the specifics of floating-point arithmetic in the target language.
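
A minimal sketch of such a cast, assuming NumPy is available (the example input values are arbitrary):

import numpy as np

# m2cgen-generated code operates on float64 (double) values, so cast the
# inputs to float64 before comparing predictions with the original model.
X_raw = [[5.1, 3.5, 1.4, 0.2]]            # example input, possibly of another dtype
X = np.asarray(X_raw, dtype=np.float64)   # feed this to both the model and the generated code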

    Issues
    • Code generated for XGBoost models returns invalid scores when tree_method is set to "hist"

      Code generated for XGBoost models returns invalid scores when tree_method is set to "hist"

      I have trained XGBoost models in Python and am using the CLI interface to convert the serialized models to pure Python. However, when I use the pure Python code, the results differ from the predictions made with the model directly.

      Python 3.7 xgboost 0.90

      My model has a large number of parameters (somewhat over 500). Here are the predicted class probabilities from the original model (screenshot omitted).

      Here are the same predicted probabilities using the generated Python code via m2cgen (screenshot omitted).

      We can see that the results are similar but not the same. The result is a significant number of cases that are moved into different classes between the two sets of predictions.

      I have also tested this with binary classification models and have the same issues.

      opened by eafpres 19
    • added support for R

      added support for R

      R is among the most popular languages for data analysis.

      opened by StrikerRUS 16
    • Limit the number of leaves in each subroutine for gradient boosted trees

      Limit the number of leaves in each subroutine for gradient boosted trees

      Fixes https://github.com/BayesWitnesses/m2cgen/issues/103

      opened by chris-smith-zocdoc 15
    • In Java interpreter ignore subroutines and perform code split based on the AST size

      In Java interpreter ignore subroutines and perform code split based on the AST size

      After investigating possible solutions for https://github.com/BayesWitnesses/m2cgen/issues/152, I came to a conclusion that with the existing design it's extremely hard to come up with the optimal algorithm to split code into subroutines on the interpreter side (and not in assemblers). The primary reason for that is that since we always interpret one expression at a time it's hard to predict both the depth of the current subtree and the number of expressions that are left to interpret in other branches. I've achieved some progress by splitting expressions into separate subroutines based on the size of the code generated so far (i.e. code size threshold), but more often than not I'll get some stupid subroutines like this one:

      public static double subroutine2(double[] input) {
          return 22.640634908349323;
      }
      

      That's why I took a simpler approach and attempted to optimize the interpreter that caused trouble in the first place - the R one. I slightly modified its behavior: when the binary expressions count threshold is exceeded, it no longer splits them into separate variable assignments, but moves them into their own subroutines. Although it might not be the most optimal way for simpler models (like linear ones), it helps tremendously with gradient boosting and random forest models. Since those models are a summation of independent estimators, we end up putting every N (5 by default) estimators into their own subroutine, improving the execution time this way. @StrikerRUS please let me know what you think.

      opened by izeigerman 14
    • enhance classifiers, add PyPI badges and long description for PyPI

      enhance classifiers, add PyPI badges and long description for PyPI

      Python 3.4 reached its End of Life this spring and there is no need to support it anymore. https://devguide.python.org/devcycle/#end-of-life-branches

      opened by StrikerRUS 13
    • added possibility to write generated code into file

      added possibility to write generated code into file

      Closed #110.

      Real-life frustrating example:

      import sys
      
      from sklearn.datasets import load_boston
      
      import lightgbm as lgb
      import m2cgen as m2c
      
      X, y = load_boston(True)
      est = lgb.LGBMRegressor(n_estimators=1000).fit(X, y)
      
      sys.setrecursionlimit(1<<30)
      print(m2c.export_to_python(est))
      
      IOPub data rate exceeded.
      The notebook server will temporarily stop sending output
      to the client in order to avoid crashing it.
      To change this limit, set the config variable
      `--NotebookApp.iopub_data_rate_limit`.
      
      Current values:
      NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
      NotebookApp.rate_limit_window=3.0 (secs)
      

      m2c.export_to_python(est, 'test.txt') works fine in this scenario.

      opened by StrikerRUS 12
    • Dart language support

      Dart language support

      For those building Flutter apps that would like to be able to utilize static models trained in scikit on-device, this tool would be a perfect fit. And if the Flutter dev team decides to add a hot code push feature to the framework, models from m2cgen could be updated on the fly.

      opened by mattc-eostar 11
    • added support for PowerShell

      added support for PowerShell

      With this PR Windows users will be able to execute ML models from "command line" without the need to install any programming language (PowerShell is already installed in Windows).

      opened by StrikerRUS 11
    • Code generated from XGBoost model includes "None"

      Code generated from XGBoost model includes "None"

      When transpiling XGBRegressor and XGBClassifier models such as the following basic example:

      from xgboost import XGBRegressor
      from sklearn import datasets
      import m2cgen as m2c
      
      iris_data = datasets.load_iris(return_X_y=True)
      
      mod = XGBRegressor(booster="gblinear", max_depth=2)
      X, y = iris_data
      mod.fit(X[:120], y[:120])
      
      code = m2c.export_to_c(mod)
      
      print(code)
      

      the resulting C code includes a Pythonesque None:

      double score(double * input) {
          return (None) + (((((-0.391196) + ((input[0]) * (-0.0196191))) + ((input[1]) * (-0.11313))) + ((input[2]) * (0.137024))) + ((input[3]) * (0.645197)));
      }
      

      Probably I am missing some basic step?

      opened by robinvanemden 10
    • fix typo

      fix typo

      The usage example is for python

      opened by Alisa-lisa 10
    • Bump scipy from 1.7.2 to 1.7.3

      Bump scipy from 1.7.2 to 1.7.3

      Bumps scipy from 1.7.2 to 1.7.3.

      Release notes

      Sourced from scipy's releases.

      SciPy 1.7.3 Release Notes

      SciPy 1.7.3 is a bug-fix release that provides binary wheels for MacOS arm64 with Python 3.8, 3.9, and 3.10. The MacOS arm64 wheels are only available for MacOS version 12.0 and greater, as explained in Issue 14688.

      Authors

      • Anirudh Dagar
      • Ralf Gommers
      • Tyler Reddy
      • Pamphile Roy
      • Olivier Grisel
      • Isuru Fernando

      A total of 6 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete.

      Commits
      • 59e6539 REL: 1.7.3 release commit.
      • 9fcfc45 Merge pull request #15088 from tylerjereddy/treddy_backports_173
      • 7275c59 DOC: update 1.7.3 relnotes.
      • 20c42c2 MAINT: PR 15088 revisions
      • 759f76e DOC: draft 1.7.3 release notes
      • 4f14791 BUG: out of bounds indexing in stats.qmc.update_discrepancy.
      • 1b9e907 Merge pull request #15090 from rgommers/arm64-17x
      • d75f647 TST: skip failing integrate tests on macOS arm64 - zvode issue
      • b785d56 TST: skip a failing arpack test on macOS arm64
      • 526e6c4 TST: mark an mpmath test that appears to hang as xslow
      • Additional commits viewable in compare view

      Dependabot compatibility score

      Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


      Dependabot commands and options

      You can trigger Dependabot actions by commenting on this PR:

      • @dependabot rebase will rebase this PR
      • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
      • @dependabot merge will merge this PR after your CI passes on it
      • @dependabot squash and merge will squash and merge this PR after your CI passes on it
      • @dependabot cancel merge will cancel a previously requested merge and block automerging
      • @dependabot reopen will reopen this PR if it is closed
      • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
      • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
      • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
      • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
      dependencies 
      opened by dependabot[bot] 0
    • Bump xgboost from 1.4.2 to 1.5.1

      Bump xgboost from 1.4.2 to 1.5.1

      Bumps xgboost from 1.4.2 to 1.5.1.

      Release notes

      Sourced from xgboost's releases.

      1.5.1 Patch Release

      This is a patch release for compatibility with the latest dependencies and bug fixes. Also, all GPU-compatible binaries are built with CUDA 11.0.

      • [Python] Handle missing values in dataframe with category dtype. (#7331)

      • [R] Fix R CRAN failures about prediction and some compiler warnings.

      • [JVM packages] Fix compatibility with latest Spark (#7438, #7376)

      • Support building with CTK11.5. (#7379)

      • Check user input for iteration in inplace predict.

      • Handle OMP_THREAD_LIMIT environment variable.

      • [doc] Fix broken links. (#7341)

      Artifacts

      You can verify the downloaded packages by running this on your Unix shell:

      echo "<hash> <artifact>" | shasum -a 256 --check
      
      3a6cc7526c0dff1186f01b53dcbac5c58f12781988400e2d340dda61ef8d14ca  xgboost_r_gpu_linux_afb9dfd4210e8b8db8fe03380f83b404b1721443.tar.gz
      6f74deb62776f1e2fd030e1fa08b93ba95b32ac69cc4096b4bcec3821dd0a480  xgboost_r_gpu_win64_afb9dfd4210e8b8db8fe03380f83b404b1721443.tar.gz
      565dea0320ed4b6f807dbb92a8a57e86ec16db50eff9a3f405c651d1f53a259d  xgboost.tar.gz
      

      Release 1.5.0 stable

      This release comes with many exciting new features and optimizations, along with some bug fixes. We will describe the experimental categorical data support and the external memory interface independently. Package-specific new features will be listed in respective sections.

      Development on categorical data support

      In version 1.3, XGBoost introduced an experimental feature for handling categorical data natively, without one-hot encoding. XGBoost can fit categorical splits in decision trees. (Currently, the generated splits will be of form x \in {v}, where the input is compared to a single category value. A future version of XGBoost will generate splits that compare the input against a list of multiple category values.)

      Most of the other features, including prediction, SHAP value computation, feature importance, and model plotting were revised to natively handle categorical splits. Also, all Python interfaces including native interface with and without quantized DMatrix, scikit-learn interface, and Dask interface now accept categorical data with a wide range of data structures support including numpy/cupy array and cuDF/pandas/modin dataframe. In practice, the following are required for enabling categorical data support during training:

      • Use Python package.
      • Use gpu_hist to train the model.
      • Use JSON model file format for saving the model.

      Once the model is trained, it can be used with most of the features that are available on

      ... (truncated)

      Changelog

      Sourced from xgboost's changelog.

      XGBoost Change Log

      This file records the changes in xgboost library in reverse chronological order.

      v1.5.0 (2021 Oct 11)

      This release comes with many exciting new features and optimizations, along with some bug fixes. We will describe the experimental categorical data support and the external memory interface independently. Package-specific new features will be listed in respective sections.

      Development on categorical data support

      In version 1.3, XGBoost introduced an experimental feature for handling categorical data natively, without one-hot encoding. XGBoost can fit categorical splits in decision trees. (Currently, the generated splits will be of form x \in {v}, where the input is compared to a single category value. A future version of XGBoost will generate splits that compare the input against a list of multiple category values.)

      Most of the other features, including prediction, SHAP value computation, feature importance, and model plotting were revised to natively handle categorical splits. Also, all Python interfaces including native interface with and without quantized DMatrix, scikit-learn interface, and Dask interface now accept categorical data with a wide range of data structures support including numpy/cupy array and cuDF/pandas/modin dataframe. In practice, the following are required for enabling categorical data support during training:

      • Use Python package.
      • Use gpu_hist to train the model.
      • Use JSON model file format for saving the model.

      Once the model is trained, it can be used with most of the features that are available on the Python package. For a quick introduction, see https://xgboost.readthedocs.io/en/latest/tutorials/categorical.html

      Related PRs: (#7011, #7001, #7042, #7041, #7047, #7043, #7036, #7054, #7053, #7065, #7213, #7228, #7220, #7221, #7231, #7306)

      • Next steps

        • Revise the CPU training algorithm to handle categorical data natively and generate categorical splits
        • Extend the CPU and GPU algorithms to generate categorical splits of form x \in S where the input is compared with multiple category values. (#7081)

      External memory

      This release features a brand-new interface and implementation for external memory (also known as out-of-core training). (#6901, #7064, #7088, #7089, #7087, #7092, #7070, #7216). The new implementation leverages the data iterator interface, which is currently used to create DeviceQuantileDMatrix. For a quick introduction, see https://xgboost.readthedocs.io/en/latest/tutorials/external_memory.html#data-iterator . During the development of this new interface, lz4 compression is removed. (#7076).

      ... (truncated)

      Commits

      Dependabot compatibility score

      Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


      Dependabot commands and options

      You can trigger Dependabot actions by commenting on this PR:

      • @dependabot rebase will rebase this PR
      • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
      • @dependabot merge will merge this PR after your CI passes on it
      • @dependabot squash and merge will squash and merge this PR after your CI passes on it
      • @dependabot cancel merge will cancel a previously requested merge and block automerging
      • @dependabot reopen will reopen this PR if it is closed
      • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
      • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
      • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
      • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
      dependencies 
      opened by dependabot[bot] 0
    • Add support for multi-label classification of DecisionTreeClassifier(sklearn)

      Add support for multi-label classification of DecisionTreeClassifier(sklearn)

      This repo is awesome! It is very useful. However, I find that it currently does not support multi-label classification for DecisionTreeClassifier (sklearn). I would appreciate it if that feature could be implemented.

      opened by HannanKan 1
    • invalid input value index

      invalid input value index


      opened by mccoysc 0
    • Support CalibratedClassifierCV /  CalibratedClassifier

      Support CalibratedClassifierCV / CalibratedClassifier

      https://scikit-learn.org/stable/modules/generated/sklearn.calibration.CalibratedClassifierCV.html

      It seems to be relatively easy (a sigmoid function + mean).

      https://github.com/scikit-learn/scikit-learn/blob/9b033758e/sklearn/calibration.py#L388 https://github.com/scikit-learn/scikit-learn/blob/9b033758e/sklearn/calibration.py#L433

      opened by rspadim 0
    • Convert from VBA function to SAS

      Convert from VBA function to SAS

      Hello all, I have tried to use the m2cgen package to translate 3 specific VBA functions to SAS or R scripts. I'm attaching an Excel spreadsheet that illustrates what I've tried to accomplish. There are 3 different functions in this Excel spreadsheet: a) COHORT: function used to calculate a transition matrix through the Cohort approach; b) GENERATOR: function used to calculate the generator matrix through the Aalen-Johansen estimator; c) MEXPGENERATOR: function used to translate the generator matrix into probabilities.

      I'd be so grateful if you are able to help me on that.

      Thanks in advance. Best Regards, Raphael

      Transition Matrix_VBA Functions.zip

      opened by raphaelchaves 0
    • taking address of temporary array

      taking address of temporary array

      Code generated from an XGBoost classifier with objective = 'binary:logistic' does not compile in C++.

      The last line copies the prediction to what I am assuming to be a 0 or 1 sigmoid output value:

      memcpy(output, (const double[]){(1.0) - (var1000), var1000}, 2 * sizeof(double));
      

      gives this error:

      error: taking address of temporary array
           memcpy(output, (double[]){(1.0) - (var1000), var1000}, 2 * sizeof(double));
                                          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
      

      it highlights this part:

      {(1.0) - (var1000), var1000}
      

      Does anyone know how I can fix this error? And if I just returned the raw value from var1000, could that also work?

      Compiler:

      g++ (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
      

      CXX_FLAGS

      -std=c++11
      
      opened by jdubz93 5
    • Any limit in export_to_c ? I am getting some python crash with large XGBoost model

      Any limit in export_to_c ? I am getting some python crash with large XGBoost model

      xgb is my trained XGBoost model (fitted with the .fit method). The model is rather large, with 3000 trees and a depth of 8.

      I save the model as a txt file using xgb.get_booster().dump_model('xgb_model_rules.txt', with_stats=True), and it is rather large (28.5 MB).

      Then I want to convert it to some C code (I am doing some post-editing to turn it into MATLAB code):

      code = m2c.export_to_c(xgb)
      print("write the code to a text file")
      text_file = open("xgb_c.txt", "w")
      text_file.write("C code for the XGB structure: %s" % code)
      text_file.close()

      My Python (version 3.8.10) crashes during code = m2c.export_to_c(xgb) ...

      Any suggestions? I tried with a smaller Boston dataset, resulting in a smaller model, and it works fine.

      I can share the xgb_model_rules.txt if needed

      opened by bhamadicharef 8
    • Add export to MATLAB

      Add export to MATLAB

      Based on the export to C, I think the MATLAB one should not be too hard.

      enhancement 
      opened by bhamadicharef 0
    • Add support for Elixir

      Add support for Elixir


      opened by lucasavila00 6
    Releases(v0.9.0)
    • v0.9.0(Sep 18, 2020)

      • Python 3.5 is no longer supported.
      • Trained models can now be transpiled into F# 🎉 .
      • Model support:
        • Added support for GLM models from the scikit-learn package.
        • Introduced support for a variety of objectives in LightGBM models.
        • The cauchy function is now supported for GLM models.
      • Improved conversion of floating point numbers into string literals. This leads to improved accuracy of results returned by generated code.
      • Improved handling of missing values in LightGBM models. Kudos to our first time contributor @Aulust 🎉
      • Various improvements of the code generation runtime.
      Source code(tar.gz)
      Source code(zip)
    • v0.8.0(Jun 18, 2020)

      • This release is the last one that supports Python 3.5. The next release will require Python >= 3.6.
      • Trained models can now be transpiled into Haskell and Ruby 🎉
      • Various improvements of the code generation runtime:
        • Introduced caching of the interpreter handler names.
        • A string buffer is now used to store generated code.
        • We moved away from using the string.Template.
      • The numpy dependency is no longer required at runtime for the generated Python code.
      • Improved model support:
        • Enabled multiclass support for XGBoost Random Forest models.
        • Added support of Boosted Random Forest models from the XGBoost package.
        • Added support of GLM models from the statsmodels package.
      • Introduced fallback expressions for a variety of functions which rely on simpler language constructs. This should simplify implementation of new interpreters since the number of functions that must be provided by the standard library or by a developer of the given interpreter has been reduced. Note that fallback expressions are optional and can be overridden by a manually written implementation or a corresponding function from the standard library. Among functions for which fallback AST expressions have been introduced are: abs, tanh, sqrt, exp, sigmoid and softmax.

      Kudos to @StrikerRUS who's responsible for all these amazing updates 💪

      Source code(tar.gz)
      Source code(zip)
    • v0.7.0(Apr 7, 2020)

      • Bug fixes:
        • Thresholds for XGBoost trees are forced to be float32 now (https://github.com/BayesWitnesses/m2cgen/issues/168).
        • Fixed support for newer versions of XGBoost, in which the default value for the base_score parameter became None (https://github.com/BayesWitnesses/m2cgen/issues/182).
      • Models can now be transpiled into the Dart language. Kudos to @MattConflitti for this great addition 🎉
      • Support for following models has been introduced:
        • Models from the statsmodels package are now supported. The list of added models includes: GLS, GLSAR, OLS, ProcessMLE, QuantReg and WLS.
        • Models from the lightning package: AdaGradRegressor/AdaGradClassifier, CDRegressor/CDClassifier, FistaRegressor/FistaClassifier, SAGARegressor/SAGAClassifier, SAGRegressor/SAGClassifier, SDCARegressor/SDCAClassifier, SGDClassifier, LinearSVR/LinearSVC and KernelSVC.
        • RANSACRegressor from the scikit-learn package.
      • The name of the scoring function can now be changed via a parameter. Thanks @mrshu 💪
      • The SubroutineExpr expression has been removed from AST. The logic of how to split the generated code into subroutines is now focused in interpreters and was completely removed from assemblers.
      Source code(tar.gz)
      Source code(zip)
    • v0.6.0(Feb 16, 2020)

      • Trained models can now be transpiled into R, PowerShell and PHP. Major effort delivered solely by @StrikerRUS.
      • The Java interpreter introduced logic that splits code into methods based on heuristics, without relying on SubroutineExpr from the AST.
      • Added support for LightGBM and XGBoost Random Forest models.
      • XGBoost linear models are now supported.
      • LassoLarsCV, Perceptron and PassiveAggressiveClassifier estimators from scikit-learn package are now supported.
      Source code(tar.gz)
      Source code(zip)
    • v0.5.0(Dec 1, 2019)

      Quite a few awesome updates in this release. Many thanks to @StrikerRUS and @chris-smith-zocdoc for making this release happen.

      • Visual Basic and C# joined the list of supported languages. Thanks @StrikerRUS for all the hard work!
      • The numpy dependency is no longer required for generated Python code when no linear algebra is involved. Thanks @StrikerRUS for this update.
      • Fixed the bug where generated Java code exceeded the JVM method size constraints when individual estimators of a GBT model contained a large number of leaves. Kudos to @chris-smith-zocdoc for discovering and fixing this issue.
      Source code(tar.gz)
      Source code(zip)
    • v0.4.0(Sep 28, 2019)

    • v0.3.1(Aug 15, 2019)

      • Fixed generation of XGBoost models in case when feature names are not specified in a model object (https://github.com/BayesWitnesses/m2cgen/pull/93). Thanks @akhvorov for contributing the fix.
      Source code(tar.gz)
      Source code(zip)
    • v0.3.0(May 21, 2019)

    • v0.2.1(Apr 17, 2019)

      • For XGBoost models, added support for the best_ntree_limit attribute to limit the number of estimators used during prediction. Thanks @arshamg for helping with that.
      Source code(tar.gz)
      Source code(zip)
    • v0.2.0(Mar 22, 2019)

      • Golang joins the family of languages supported by m2cgen 🎉 Credit goes to @matbur for making such a significant contribution 🥇
      • For generated C code the custom assign_array function that was used to assign vector values has been replaced with plain memcpy.
      Source code(tar.gz)
      Source code(zip)
    • v0.1.1(Mar 4, 2019)

    • v0.1.0(Feb 12, 2019)

Owner
Bayes' Witnesses