distributed-computing
Here are 916 public repositories matching this topic...
Naming inconsistency
Describe the bug
I found that some argument names in the framework aren't consistent.
For example:

    class SupervisedRunner(Runner):
        """Runner for experiments with supervised model."""
        _experiment_fn: Callable = SupervisedExperiment

        def __init__(
            self,
            model: Model = None,
            device: Device = None,
            input_key: Any = "features",
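One way to surface this kind of mismatch is to compare constructor parameter names across classes mechanically. The sketch below is hypothetical: `InferRunner` and the `input_key`/`input_keys` split are invented here purely to illustrate the audit technique, not taken from the report.

```python
import inspect

class SupervisedRunner:
    """Stand-in for the real class; only parameter names matter here."""
    def __init__(self, model=None, device=None, input_key="features"):
        pass

class InferRunner:
    """Hypothetical second class whose naming drifts to the plural form."""
    def __init__(self, model=None, device=None, input_keys=("features",)):
        pass

def param_names(cls):
    # Collect __init__ parameter names, dropping 'self'
    return [p for p in inspect.signature(cls.__init__).parameters if p != "self"]

names = {cls.__name__: param_names(cls) for cls in (SupervisedRunner, InferRunner)}
# names now exposes the inconsistency: 'input_key' in one class,
# 'input_keys' in the other
```

Running such a check in CI would catch near-duplicate argument names before they ship.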
Clients created with worker_client or get_client don't respect the timeout settings (e.g. distributed.comm.timeouts.connect). The timeout is available to set programmatically, but defaults to 3 rather than falling back to the config file. I think this should be as simple as replacing timeout=3 with timeout=None throughout that code path.
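The proposed fix amounts to treating `None` as "look up the configured value" instead of hard-coding a default. The sketch below is a minimal standalone illustration of that pattern; `resolve_timeout`, `parse_timeout`, and the `CONFIG` dict are assumptions standing in for dask's real config machinery, not its actual API.

```python
# Stand-in for the values distributed would read from its config file
CONFIG = {"distributed.comm.timeouts.connect": "10s"}

def parse_timeout(value):
    # Minimal parser for duration strings like "10s"
    return float(value.rstrip("s"))

def resolve_timeout(timeout=None):
    # None means "defer to the config file" rather than a baked-in constant
    if timeout is None:
        return parse_timeout(CONFIG["distributed.comm.timeouts.connect"])
    return float(timeout)

print(resolve_timeout())   # 10.0, taken from the config
print(resolve_timeout(3))  # 3.0, explicit override still wins
```

With `timeout=None` as the default throughout the code path, callers who never pass a timeout automatically pick up changes to the config file.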
If enter_data() is called with the same train_path twice in a row and the data itself hasn't changed, a new Dataset does not need to be created.
We should add a column which stores some kind of hash of the actual data. When a Dataset would be created, if the metadata and data hash are exactly the same as an existing Dataset, nothing should be added to the ModelHub database and the existing Dataset should be reused.
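The dedup step described above can be sketched with a content hash as the lookup key. Everything here is hypothetical scaffolding: `data_hash`, `get_or_create_dataset`, and the in-memory `existing` dict stand in for the proposed ModelHub column and query, not the project's real schema.

```python
import hashlib

def data_hash(path):
    # Hash the raw file bytes so unchanged data always maps to the same digest
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# In-memory stand-in for the proposed hash column: (metadata, digest) -> id
existing = {}

def get_or_create_dataset(metadata, path):
    key = (metadata, data_hash(path))
    if key in existing:
        # Metadata and data hash match an existing Dataset: reuse it
        return existing[key]
    dataset_id = len(existing) + 1  # stand-in for a ModelHub insert
    existing[key] = dataset_id
    return dataset_id
```

Calling `get_or_create_dataset` twice with the same metadata and an unchanged file returns the same id, so no duplicate row is written.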
How do I resume training for text classification?