
artificial-intelligence

The branch of computer science dealing with the reproduction or mimicking of human-level intelligence, self-awareness, knowledge, consciousness, and thought in computer programs.

Here are 7,658 public repositories matching this topic...

ines
ines commented Sep 29, 2019

I was going through the existing enhancement issues again and thought it'd be nice to collect ideas for spaCy plugins and related projects. There are always people in the community who are looking for new things to build, so here's some inspiration. For existing plugins and projects, check out the spaCy universe.

If you have questions about the projects I suggested,

hanbaoan123
hanbaoan123 commented Feb 24, 2020

Issue Description

When I run the cartpole example with the default parameters, it cannot converge to the maximum reward of 200. I wonder what went wrong.
[screenshot attached]

Version Information

Please indicate relevant versions, including, if relevant:

  • Deeplearning4j versi
xfan1024
xfan1024 commented Jan 7, 2020

I found that in examples/retinaface.cpp, a memory leak seems to occur when a face is detected if OMP acceleration is enabled, but I cannot pin down the exact cause of the problem.

It is worth noting that if the OMP directives in the qsort_descent_inplace function are commented out, the problem disappears.


static void qsort_descent_inplace(std::vector<FaceObject>& faceobjects, int left, int right)
{
    int i = left;
    int j = right;
    float p = faceobjects[(left + right) / 2].prob;
    ...
    // #pragma omp parallel sections
    {
        // #pragma
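
For context, the preview cuts the function off right at the recursion step. Below is a minimal, self-contained sketch of the same parallel-sections quicksort pattern; it uses a plain std::vector<float> instead of FaceObject purely as an assumption so it compiles on its own (build with -fopenmp), and is meant to show the kind of OMP directives being commented out, not the exact ncnn code.

#include <algorithm>
#include <cstdio>
#include <vector>

// Descending quicksort whose two recursive halves run as OMP sections.
static void qsort_descent_inplace(std::vector<float>& v, int left, int right)
{
    int i = left;
    int j = right;
    float p = v[(left + right) / 2];  // pivot: middle element

    // Hoare-style partition for descending order.
    while (i <= j)
    {
        while (v[i] > p) i++;
        while (v[j] < p) j--;
        if (i <= j)
        {
            std::swap(v[i], v[j]);
            i++;
            j--;
        }
    }

    // The directives in question: each half recurses in its own section,
    // so every recursion level opens a new parallel region.
    #pragma omp parallel sections
    {
        #pragma omp section
        {
            if (left < j) qsort_descent_inplace(v, left, j);
        }
        #pragma omp section
        {
            if (i < right) qsort_descent_inplace(v, i, right);
        }
    }
}

int main()
{
    std::vector<float> scores = {0.3f, 0.9f, 0.1f, 0.7f, 0.5f};
    qsort_descent_inplace(scores, 0, (int)scores.size() - 1);
    for (float s : scores)
        printf("%.1f ", s);  // expected: 0.9 0.7 0.5 0.3 0.1
    printf("\n");
    return 0;
}
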
loomlike
loomlike commented Apr 15, 2019

Description

Add Azure notebook to our SETUP doc.
I tested Google Colab and Azure Notebooks to run reco-repo without needing to create any DSVM or compute myself, and it works really well with simple tweaks to the notebooks (e.g., some libs need to be installed manually).

I think it would be good to add at least Azure notebook to our SETUP doc, where users can easily test out our repo w/o

bodgergely
bodgergely commented Jun 12, 2017

Upon environment timeout, the Python client will only receive the error message "Environment in wrong status for call to observations()". It might be good to provide more information about why the environment is not running anymore (due to timeout, etc.).

if (!is_running(self)) {
  PyErr_SetString(PyExc_RuntimeError,
                  "Environment in wrong status for call to observations()");
  return NULL;
}
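
One way to surface the reason, sketched against the snippet above: use PyErr_Format so the message can carry the underlying status. This is only a sketch of the idea; env_status_string() is a hypothetical accessor standing in for however the binding actually exposes that state.

/* Sketch: report why the environment is not running.
 * env_status_string() is a HYPOTHETICAL helper -- the real binding may
 * expose the status under a different name or as a different type. */
if (!is_running(self)) {
  PyErr_Format(PyExc_RuntimeError,
               "Environment in wrong status for call to observations() "
               "(current status: %s)",
               env_status_string(self));
  return NULL;
}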
RootChenLQ
RootChenLQ commented Oct 30, 2019

There is a nan value in multistepBucketLikelihoods when I use my own dataset and set _NUM_RECORDS to 6000. The error is listed below.

multistepBucketLikelihoods = {1: {499: 1.0}, 5: {499: nan, 501: 0.0}}
File "D:\ProgramData\PythonWorkspace\nupic\docs\examples\opf\test.py", line 52, in runHotgym fiveStepConfidence = allPredictions[5][fiveStep]
File "D:\ProgramData\PythonWorkspace\nup
tensorlayer
0xtyls
0xtyls commented Jan 3, 2020

I understand that these two Python files show two different methods of constructing a model. The original n_epoch is 500, which works perfectly for both files. But if I change n_epoch to 20, only tutorial_mnist_mlp_static.py achieves a high test accuracy (~0.97); the other file, tutorial_mnist_mlp_static_2.py, only gets 0.47.

The models built from these two files look the same to me (the s

pytorch-lightning
MaJian199609
MaJian199609 commented Mar 16, 2019

I'm submitting a ... (check one with "x")

[ ] bug report
[ ] help wanted
[ ] feature request

Current behavior

Expected/desired behavior

Reproduction of the problem

If the current behavior is a bug or you can illustrate your feature request better with an example, please provide the steps to reproduce.

What is the expected behavior?

LeonardoDavid
LeonardoDavid commented Nov 14, 2018
  • Operating System: Windows
  • Serpent.AI Version: not sure
  • Game: (Cuphead) Executable
  • Backend: GPU

I followed the Hello World tutorial and created a plugin for the Cuphead executable game. When I launch the game, I get this error:

(AI) C:\Users\ANTONY\SerpentAI>serpent launch cuphead
Traceback (most recent call last):
  File "c:\programdata\anaconda3\envs\ai\lib\runpy
auphofBSF
auphofBSF commented Aug 21, 2019

For a more consistent and multi-functional global level of verbosity control, I suggest an enhancement that converts print(...) calls in the project to use the Python logging module.

import logging

# Configure once at startup; the level sets the global verbosity.
logging.basicConfig(level=logging.INFO)

# Then, instead of print(), use the level that fits the message:
logging.debug("detailed diagnostic output")
logging.info("normal progress message")
logging.warning("something unexpected but recoverable")
logging.error("something failed")

In that way verbosity can be globally

carla