
distributed-computing

Here are 916 public repositories matching this topic...

ogvalt commented Apr 25, 2020

Describe the bug
I found that some argument names in the framework aren't consistent. For example:

class SupervisedRunner(Runner):
    """Runner for experiments with supervised model."""

    _experiment_fn: Callable = SupervisedExperiment

    def __init__(
        self,
        model: Model = None,
        device: Device = None,
        input_key: Any = "features",
        ...
jcrist commented Sep 15, 2020

Clients created with worker_client or get_client don't respect the timeout settings (e.g. distributed.comm.timeouts.connect). The timeout can be set programmatically, but it defaults to 3 rather than falling back to the config file. I think this should be as simple as replacing timeout=3 with timeout=None throughout that code path.
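
The proposed fallback might look roughly like the sketch below (only the distributed.comm.timeouts.connect key comes from the issue; resolve_connect_timeout is a hypothetical helper, not part of distributed):

import dask

def resolve_connect_timeout(timeout=None):
    # Hypothetical helper sketching the proposed behaviour: when the caller
    # does not pass an explicit timeout, fall back to the value configured
    # under distributed.comm.timeouts.connect instead of a hard-coded 3.
    if timeout is None:
        timeout = dask.config.get("distributed.comm.timeouts.connect")
    return timeout

With a fallback like this, passing timeout=None from worker_client or get_client would pick up whatever the config file specifies rather than the hard-coded default.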

A full-stack, reactive architecture for general-purpose programming. Algebraic and monadically composable primitives for concurrency, parallelism, event handling, transactions, multithreading, Web, and distributed computing with complete de-inversion of control (no callbacks, no blocking, pure state).

  • Updated Mar 2, 2020
  • Haskell
bcyphers commented Jan 31, 2018

If enter_data() is called with the same train_path twice in a row and the data itself hasn't changed, a new Dataset does not need to be created.

We should add a column which stores some kind of hash of the actual data. When a Dataset would be created, if the metadata and data hash are exactly the same as an existing Dataset, nothing should be added to the ModelHub database and the existing Dataset should be reused.
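
A rough sketch of that check could look like the following (file_sha256 is plain standard-library hashing; find_dataset and create_dataset are hypothetical stand-ins for the ModelHub database layer, not its actual API):

import hashlib

def file_sha256(path, chunk_size=1 << 20):
    # Hash the raw training data so unchanged files map to the same value.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def get_or_create_dataset(db, train_path, metadata):
    # Hypothetical dedup check: reuse the existing Dataset when both the
    # metadata and the data hash match an entry already in the database.
    data_hash = file_sha256(train_path)
    existing = db.find_dataset(metadata=metadata, data_hash=data_hash)
    if existing is not None:
        return existing
    return db.create_dataset(metadata=metadata, data_hash=data_hash, train_path=train_path)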
