Stars: 4
Forks: 4
Language: C++
Last Updated: Jun 01, 2022
Similar Repos
Language | Stars | Description | Updated At |
---|---|---|---|
Jupyter Notebook | 21 | Neural Network Bayesian Kernel Inference. | Oct 17, 2022 |
Python | 5 | Likelihood Inference Neural Network Accelerator | Mar 07, 2023 |
Python | 5 | Neural network inference on serverless architecture | May 09, 2021 |
C | 10 | A simple neural network inference framework | May 14, 2023 |
C++ | 323 | Benchmarking Neural Network Inference on Mobile Devices | Jul 21, 2022 |
C++ | 2 | Benchmarking Neural Network Inference on Mobile Devices | May 14, 2021 |
C++ | 10 | Private and Reliable Neural Network Inference (CCS '22) | Apr 08, 2023 |
Python | 11 | A fast minimally-intrusive neural network inference library | Apr 13, 2023 |
Python | 29 | Simple inference codes for Neural Network (AI) models | Mar 20, 2023 |
Jupyter Notebook | 7 | Membership Inference Attacks and Defenses in Neural Network Pruning | Apr 16, 2022 |
C++ | 10 | Deep Neural Network inference using Xilinx Zynq-7000 chip. | Jul 16, 2022 |
Python | 502 | Neural Network Compression Framework for enhanced OpenVINO™ inference | Aug 09, 2022 |
Python | 2 | Feature Space Particle Inference for Neural Network Ensembles (ICML 2022) | Jun 03, 2022 |
C++ | 3 | MNN is a lightweight deep neural network inference engine. | May 14, 2021 |
C++ | 10 | Escoin: Efficient Sparse Convolutional Neural Network Inference on GPUs | Apr 03, 2022 |
C++ | 4 | Lightspeed C++ Neural Network (UE) Inference Library for Chess | Apr 14, 2023 |
None | 2 | MNN is a lightweight deep neural network inference engine. | Oct 15, 2021 |
Matlab | 7 | Neural Variational Inference and Learning for Sigmoid Belief Network | Mar 24, 2022 |
C++ | 10 | Deep neural network inference transpiler tool for tflite and NNAPI | Feb 24, 2023 |
Jupyter Notebook | 223 | A journey into Convolutional Neural Network visualization | Jul 28, 2022 |
R | 4 | Neural Network Weights Transformation into Polynomial Coefficients | May 08, 2023 |
C++ | 9 | GUI demo of deep neural network inference with ncnn and imgui | Jul 18, 2022 |
C++ | 20 | SNIG: Accelerated Large Sparse Neural Network Inference using Task Graph Parallelism | Mar 30, 2023 |
C++ | 3 | A tensorflow-compatible, dependency-free lightweight C++ library for neural network inference. | Aug 19, 2020 |
C++ | 106 | AMD OpenVX modules: such as, neural network inference, 360 video stitching, etc. | Apr 23, 2022 |
C++ | 5 | Improved Secure 3-Party Neural Network Inference with Reducing Online Communication Costs | Feb 15, 2023 |
PHP | 11 | Integrate your OSSN powered social network into 3rd party application | Mar 24, 2023 |
None | 4 | High-efficiency floating-point neural network inference operators for mobile, server, and Web | Jul 12, 2022 |
C | 1190 | High-efficiency floating-point neural network inference operators for mobile, server, and Web | Aug 30, 2022 |
C++ | 3 | A dataflow architecture for universal graph neural network inference via multi-queue streaming. | Jul 08, 2022 |
Jupyter Notebook | 2 | Light-weighted neural network inference for object detection on small-scale FPGA board | Jan 15, 2022 |
Python | 104 | Neural network to convert a sketch into a face. | Feb 27, 2023 |
Python | 23 | Cat/dog neural network implemented into a website using Flask | Mar 05, 2023 |
Python | 2 | JiaxiongWeng-Conor/integrate-ChatGPT-into-LINE | Apr 13, 2023 |
C++ | 5 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Aug 09, 2022 |
Python | 843 | Neural network inference engine that delivers GPU-class performance for sparsified models on CPUs | Aug 17, 2022 |
C++ | 15286 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Aug 22, 2022 |
C++ | 2 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Mar 05, 2023 |
None | 2 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Apr 12, 2022 |
C | 3 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Mar 25, 2022 |
C++ | 89 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Aug 08, 2022 |
C | 11 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Apr 20, 2023 |
C | 2 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Dec 07, 2022 |
Kotlin | 10 | This repo demonstrates how to use KotlinDL for neural network inference on Android devices. | Nov 26, 2022 |
C++ | 2 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Feb 16, 2023 |
C++ | 2 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Aug 28, 2022 |
C++ | 13 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Apr 28, 2022 |
C++ | 8 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Aug 20, 2020 |
C | 66 | ffcnn is a cnn neural network inference framework, written in 600 lines of C. | May 18, 2023 |
C++ | 3 | ncnn is a high-performance neural network inference framework optimized for the mobile platform | Oct 19, 2023 |