
Scala


Scala is a general-purpose programming language providing support for functional programming and a strong static type system. It is designed to be concise, and many of its design decisions aim to address criticisms of Java.
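
As a quick flavour of that (a minimal REPL-style sketch, not taken from any of the projects below), case classes, type inference, and higher-order functions combine in a few lines:

final case class User(name: String, age: Int)

val users = List(User("Ada", 36), User("Alan", 41))
// filter and map with inferred types; result is List("Alan")
val names: List[String] = users.filter(_.age > 40).map(_.name)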

Here are 9,671 public repositories matching this topic...

ignasi35 commented Nov 26, 2019

The minimal Java imports:

https://github.com/playframework/playframework/blob/576c0def93c7abe8df7802fb3549b2e02475005b/dev-mode/build-link/src/main/java/play/TemplateImports.java#L32

imports scala.collection.JavaConverters, which is deprecated in Scala 2.13. This triggers a warning/error (depending on your scalac options):

[error] /Users/ignasi/git/projects/lightbend/playframewo
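
For context: in Scala 2.13 the deprecated scala.collection.JavaConverters is superseded by scala.jdk.CollectionConverters, so a migration could look like the following REPL-style sketch (values are illustrative):

import scala.jdk.CollectionConverters._   // 2.13 replacement for scala.collection.JavaConverters

val javaList = new java.util.ArrayList[String]()
javaList.add("play")

// the .asScala / .asJava extension methods keep working as before
val asScalaBuffer = javaList.asScala
val backToJava    = asScalaBuffer.asJava
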
hanbaoan123 commented Feb 24, 2020

Issue Description

When I run the Cartpole example with the default parameters, it cannot converge to the maximum reward of 200. I wonder what went wrong.
(screenshot of the training run attached)

Version Information

Please indicate relevant versions, including, if relevant:

  • Deeplearning4j version:
SIkebe commented Jan 2, 2020

Issue

Impacted version: 4.33.0

Deployment mode: standalone app

Repro steps:

  1. Wiki -> New Page
  2. Fill in each item and click "Save" button.
    • Page name: title+
    • Content: aaa
  3. Redirect to the edit page.
    • Page name: title instead of title+
    • Content: (blank)

As + is decoded to whitespace, should it be treated as an invalid character here?
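
For reference, this is standard form/query decoding behaviour rather than anything GitBucket-specific; a REPL-style sketch with java.net.URLDecoder (values are illustrative):

import java.net.{URLDecoder, URLEncoder}

// In application/x-www-form-urlencoded data, '+' stands for a space:
val decoded = URLDecoder.decode("title+", "UTF-8")    // "title "

// A literal '+' survives the round trip only if it arrives percent-encoded:
val encoded   = URLEncoder.encode("title+", "UTF-8")  // "title%2B"
val roundTrip = URLDecoder.decode(encoded, "UTF-8")   // "title+"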

adriancole commented Aug 29, 2018

Let's support at least reading the "b3" header from a single string, most commonly traceid-spanid-1.
It would also be nice to support optionally writing this, especially in message providers or others with constrained environments.

Expected behavior

As discussed on openzipkin/b3-propagation#21 and first implemented here: https://github.com/openzipkin/brave/blob/master/brave/src/main/java/bra
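
A rough sketch of what reading the single-string form could look like; this is not Brave's actual API, just an illustration of the traceid-spanid-1 layout mentioned above, with the optional sampling and parent-span fields:

final case class B3Context(traceId: String,
                           spanId: String,
                           sampled: Option[Boolean],
                           parentSpanId: Option[String])

// hypothetical parser for "traceid-spanid[-sampled[-parentspanid]]"
def parseB3(value: String): Option[B3Context] =
  value.split("-", 4) match {
    case Array(t, s)       => Some(B3Context(t, s, None, None))
    case Array(t, s, f)    => Some(B3Context(t, s, Some(f == "1"), None))
    case Array(t, s, f, p) => Some(B3Context(t, s, Some(f == "1"), Some(p)))
    case _                 => None
  }

// parseB3("80f198ee56343ba864fe8b2a57d3eff7-e457b5a2e4d86bd1-1")
//   => Some(B3Context("80f198ee...", "e457b5a2...", Some(true), None))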

lk-geimfari commented Aug 30, 2019

Well, we need to check all the projects and remove outdated ones.

We declare this in the readme:

However, keep in mind that we don't accept mammoth's shit. Only active and interesting projects with good documentation are added. Dead and abandoned projects will be removed.

But sadly, at this moment, this project looks like mammoth's shit itself.

We really need to fix it. I think

fennuzhichui commented Jan 2, 2020

How do I specify Java 8 when submitting an application with spark-submit?


name: Bug report/Feature request/Question
about: Create a report to help us improve
title: ''
label: bug/enhancement/question
assignees: ''

Environment:

  • Java version:
  • Scala version:
  • Spark version:
  • PyTorch and Python version:
  • OS and version:

Checklist:

  • Did you check if your bug/feature/
odersky commented Nov 16, 2019

Scala allows local blocks as in

def foo(x: Int) = {
  val y = x

  { val z = y * y
    println(x)
  }
  y - 1
}

This is dangerous, as it only works if there is an empty line in front of the local block. Dropping the empty line can give obscure type errors or even change the meaning of the program, since the block is then treated as an argument to a function on the preceding line.
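
To make that concrete, here is the same method with the blank line dropped, annotated with how it is parsed (a minimal sketch of the failure mode described above):

def foo(x: Int) = {
  val y = x
  { val z = y * y    // without the blank line these braces attach to the previous
    println(x)       // expression, i.e. this is parsed as `val y = x { ... }`,
  }                  // so the compiler rejects it with a confusing type error
  y - 1              // (an Int does not take an argument)
}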

LeeTibbert commented Apr 6, 2020

Once Issue #1749 is merged, build.sbt should be edited to add the incantation
Global / onChangedBuildSource := ReloadOnSourceChanges

This is available as of sbt 1.3.8. I find it reduces my cycle time & frustration when making edits to build.sbt, etc. Basically, I do not have to go through a 5- or 15-minute failure cycle because I forgot to do a manual reload. With that setting, sbt detects changed build sources and reloads the build automatically.
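
For anyone copying this, a minimal build.sbt sketch of where the line goes (project name and versions are illustrative); the other accepted values for the key are WarnOnSourceChanges and IgnoreSourceChanges:

// build.sbt (sbt 1.3.x or newer)
Global / onChangedBuildSource := ReloadOnSourceChanges

lazy val root = (project in file("."))
  .settings(
    name := "example-project",   // illustrative
    scalaVersion := "2.13.1"     // illustrative
  )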

yiheng commented Jul 11, 2018

Spark 2.3 officially supports running on Kubernetes, while our "Run on Kubernetes" guide is still based on a special build of Spark 2.2, which is out of date. We need to:

  1. update that document to Spark 2.3
  2. release the corresponding docker images.
casperdcl commented Jul 16, 2019

Perhaps it would be nice to provide an example where injected parameters are inside a dictionary or object:

# parameters (the notebook's parameters cell)
import argparse
args = argparse.Namespace()
args_dict = {}

args.a = 1
args_dict['b'] = 2

# proposed command-line invocation overriding those values:
papermill ... -p args.a 1.618 -p "args_dict['b']" 3.14159

This would of course be useful for making transition between *.py scripts (using e.g

yeikel commented Jan 4, 2019

According to the generated build, the commands to launch are the following:

docker pull andypetrella/spark-notebook:0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2-with-hive
docker run -p 9001:9001 andypetrella/spark-notebook:0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2-with-hive

Using that image (and I think it i

Created by Martin Odersky

Released January 20, 2004

Website
www.scala-lang.org
Wikipedia