Scala

Scala is a general-purpose programming language providing support for functional programming and a strong static type system. It is designed to be concise, and many of its design decisions aim to address criticisms of Java.
In #10199 we added symbolic methods to TypeMap, but as pointed out by @NthPortal we should avoid this.
Triggered by:
I'm a bit late to the party here, but multiple (non-implicit) parameters to a symbolic/infix method are highly discouraged, and planned to be deprecated. See scala/scala-dev#496 and lampepfl/dotty#4311 (comment).
_Originally posted by @n
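To illustrate the point being quoted, here is a small Scala 2 sketch with hypothetical names (it is not the TypeMap code in question): a symbolic method with multiple non-implicit parameters reads as if it took a tuple at an infix call site, while a named method stays clear either way.

```scala
final case class Interval(lo: Int, hi: Int) {
  // Discouraged: a symbolic method with multiple (non-implicit) parameters.
  def +~(shift: Int, scale: Int): Interval =
    Interval((lo + shift) * scale, (hi + shift) * scale)

  // Preferred: a named method with the same behaviour.
  def shiftAndScale(shift: Int, scale: Int): Interval =
    Interval((lo + shift) * scale, (hi + shift) * scale)
}

object SymbolicDemo extends App {
  val i = Interval(1, 5)
  println(i +~ (2, 3))           // looks like +~ takes a tuple, but these are two arguments
  println(i.+~(2, 3))            // unambiguous, but no longer reads as an operator
  println(i.shiftAndScale(2, 3)) // clear at the call site
}
```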
I am trying to deploy the app with the `./sbt clean dist` command, but I get this error:
Downloading sbt launcher for 1.3.8:
From https://repo.scala-sbt.org/scalasbt/maven-releases/org/scala-sbt/sbt-launch/1.3.8/sbt-launch-1.3.8.jar
To /root/.sbt/launchers/1.3.8/sbt-launch.jar
Downloading sbt launcher 1.3.8 md5 hash:
From https://repo.scala-sbt.org/scalasbt/maven-releas
Issue
Impacted version: 4.33.0
Deployment mode: standalone app
Repro steps:
- Wiki -> New Page
- Fill in each item and click the "Save" button:
  - Page name: title+
  - Content: aaa
- You are redirected to the edit page, which shows:
  - Page name: title instead of title+
  - Content: (blank)
As + is decoded to whitespace, should it be treated as an invalid character here?
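For reference, a small self-contained check (plain Scala, not GitBucket code) of the decoding behaviour behind this: in form-encoded data a "+" stands for a space, so a literal "+" in a page name only survives the round trip if it is percent-encoded as %2B.

```scala
import java.net.{URLDecoder, URLEncoder}

object PlusDecodingDemo extends App {
  // "+" in application/x-www-form-urlencoded input decodes to a space.
  println(URLDecoder.decode("title+", "UTF-8"))   // "title " (trailing space)
  // Encoding a literal "+" produces "%2B", which round-trips correctly.
  println(URLEncoder.encode("title+", "UTF-8"))   // "title%2B"
  println(URLDecoder.decode("title%2B", "UTF-8")) // "title+"
}
```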
Let's support at least reading the "b3" header from a single string, most commonly traceid-spanid-1.
It would also be nice to support optionally writing this, especially in message providers or others with constrained environments.
Expected behavior
As discussed on openzipkin/b3-propagation#21 and first implemented here: https://github.com/openzipkin/brave/blob/master/brave/src/main/java/bra
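As a rough illustration (a sketch of the format only, not the Brave/Zipkin API), the single-string form is traceid-spanid-sampled-parentspanid with the last two fields optional, so reading and writing it can look like this:

```scala
// Sketch only: names and types here are illustrative, not Brave's.
final case class B3Context(
    traceId: String,
    spanId: String,
    sampled: Option[String],      // "1", "0" or "d" (debug)
    parentSpanId: Option[String]
)

object B3Single {
  def read(header: String): Option[B3Context] =
    header.split("-") match {
      case Array(t, s)       => Some(B3Context(t, s, None, None))
      case Array(t, s, f)    => Some(B3Context(t, s, Some(f), None))
      case Array(t, s, f, p) => Some(B3Context(t, s, Some(f), Some(p)))
      case _                 => None
    }

  def write(ctx: B3Context): String =
    (Seq(ctx.traceId, ctx.spanId) ++ ctx.sampled.toSeq ++ ctx.parentSpanId.toSeq)
      .mkString("-")
}

// Example: B3Single.read("80f198ee56343ba864fe8b2a57d3eff7-e457b5a2e4d86bd1-1")
```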
The images are quite nice but sometimes they feel kinda cheap. It would be nice if they could be redrawn with TikZ.
This would give a touch of class to a book that is already beautiful.
Simple solution: ?
steps
- Open https://www.scala-sbt.org/1.x/api/sbt/util/PlainInput.html
- Click on source link "Input.scala"
problem
- Link 404s
expectation
- Link points to the source on GitHub
notes
In http://dotty.epfl.ch/docs/reference/contextual/relationship-implicits.html#given-instances the documentation says that given instances without parameters are mapped to implicit objects. However, doing the translation manually shows that given instances are not implicit objects: the two behave differently with respect to the specificity rule.
- givens produce an ambiguity error -
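For comparison, a minimal Scala 3 sketch using a hypothetical Show type class, showing a parameterless given instance next to the implicit object the documentation presents as its translation; the report's point is that implicit resolution does not treat the two forms identically with respect to specificity.

```scala
trait Show[A]:
  def show(a: A): String

object GivenVersion:
  // A given instance without parameters, as in the documentation.
  given Show[Int] with
    def show(a: Int): String = a.toString

object ImplicitObjectVersion:
  // The implicit object the documentation presents as its translation.
  implicit object IntShow extends Show[Int]:
    def show(a: Int): String = a.toString

@main def showDemo(): Unit =
  import GivenVersion.given
  println(summon[Show[Int]].show(42))
```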
Per the BufferedInputStream documentation, the skip(n) method corresponds to the general contract in InputStream, which is:
"If n is negative, the skip method for class InputStream always returns 0, and no bytes are skipped. Subclasses may handle the negative value differently."
From this I'd expect scala-native's BufferedInputStream.skip to return 0. Instead an `IllegalArgumentEx
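For reference, a quick JVM check of the quoted contract (plain Scala, not the Scala Native test suite): BufferedInputStream.skip with a negative argument returns 0 and skips nothing.

```scala
import java.io.{BufferedInputStream, ByteArrayInputStream}

object SkipContractDemo extends App {
  val in = new BufferedInputStream(new ByteArrayInputStream("hello".getBytes("UTF-8")))
  println(in.skip(-1)) // 0 on the JVM: no bytes are skipped
  println(in.read())   // 104 ('h'), confirming nothing was skipped
}
```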
Plotting UI revamp
The plotting UI leaves a lot to be desired:
- When you bring up the plotting UI, it's not obvious that you have to do some work to make a plot happen. If you don't, there's no error message, just a never-ending wait. (see https://gitter.im/polynote/polynote?at=5e0e36f9eac8d1511e9ed2ff )
- Why do we make you drag things onto axes? Especially when there is only one axis it could go onto (at leas
Spark 2.3 officially supports running on Kubernetes, while our "Run on Kubernetes" guide is still based on a special version of Spark 2.2 and is out of date. We need to:
- update that document to Spark 2.3
- release the corresponding docker images.
Hi all!
Just want to share with the team some details I've run into while executing notebooks from the command line using a YAML file.
First, let me show my case. I've been parameterizing different notebooks to isolate data wrangling processes. To do it, I needed to use lists of dictionaries to specify keys describing my data, such as area or paths where some files were stored. As all
add microsite?
According to the generated build, the commands to launch are the following:
docker pull andypetrella/spark-notebook:0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2-with-hive
docker run -p 9001:9001 andypetrella/spark-notebook:0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2-with-hive
Using that image (and I think it i
I've noticed all the links on template homepages are broken.
I can't find where these links are set, though. It doesn't seem to be here.