Scala

Scala is a general-purpose programming language providing support for functional programming and a strong static type system. Designed to be concise, Scala aims to address many of the criticisms of Java through its design decisions.
Here are 10,158 public repositories matching this topic...
In #10199 we added symbolic methods to TypeMap, but as @NthPortal pointed out, we should avoid this.
Triggered by:
I'm a bit late to the party here, but multiple (non-implicit) parameters to a symbolic/infix method are highly discouraged, and planned to be deprecated. See scala/scala-dev#496 and lampepfl/dotty#4311 (comment).
_Originally posted by @n
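To make the concern in the quote concrete, here is a hedged sketch of the discouraged pattern; the TypeMap shown is a stand-in for illustration, not the actual class from #10199:

```scala
// Illustrative only: a stand-in TypeMap, not the real one from the PR.
final case class TypeMap[V](underlying: Map[Class[_], V]) {

  // Discouraged: a symbolic method taking multiple (non-implicit) parameters.
  // The infix call site `map +* (classOf[String], "s")` is easy to misread
  // as passing a single tuple.
  def +*(key: Class[_], value: V): TypeMap[V] =
    TypeMap(underlying + (key -> value))

  // Preferred: an ordinary alphanumeric name keeps the call site unambiguous.
  def addEntry(key: Class[_], value: V): TypeMap[V] =
    TypeMap(underlying + (key -> value))
}
```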
I am trying to deploy the app with the ./sbt clean dist command, but I got this error:
Downloading sbt launcher for 1.3.8:
From https://repo.scala-sbt.org/scalasbt/maven-releases/org/scala-sbt/sbt-launch/1.3.8/sbt-launch-1.3.8.jar
To /root/.sbt/launchers/1.3.8/sbt-launch.jar
Downloading sbt launcher 1.3.8 md5 hash:
From https://repo.scala-sbt.org/scalasbt/maven-releas
Issue
Impacted version: 4.33.0
Deployment mode: standalone app
Repro steps:
- Wiki -> New Page
- Fill in each item and click the "Save" button.
  - Page name: title+
  - Content: aaa
- Redirected to the edit page.
  - Page name: title instead of title+
  - Content: (blank)
As + is decoded to whitespace, should it be treated as an invalid character he
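For illustration, a minimal sketch of the decoding behaviour described above, using java.net.URLDecoder; this is an assumption about the mechanism, not a pointer to the project's actual code path:

```scala
import java.net.{URLDecoder, URLEncoder}

object PlusDecodingDemo extends App {
  // In application/x-www-form-urlencoded decoding, '+' means a space,
  // which is why a page saved as "title+" comes back as "title ".
  println(URLDecoder.decode("title+", "UTF-8"))   // "title "
  // Percent-encoding preserves a literal '+' across the round trip.
  println(URLEncoder.encode("title+", "UTF-8"))   // "title%2B"
  println(URLDecoder.decode("title%2B", "UTF-8")) // "title+"
}
```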
Let's support at least reading the "b3" header from a single string, most commonly traceid-spanid-1.
It would also be nice to optionally support writing this format, especially in message providers or others with constrained environments.
Expected behavior
As discussed on openzipkin/b3-propagation#21 and first implemented here: https://github.com/openzipkin/brave/blob/master/brave/src/main/java/bra
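As a rough illustration of the single-string format mentioned above (traceid-spanid-1), here is a hedged Scala sketch; the case class and method names are invented for this example and only cover the common two- and three-field forms, not the full b3 single-header proposal or the brave API:

```scala
// Illustrative only: a toy reader/writer for "{traceId}-{spanId}[-{samplingState}]",
// e.g. "80f198ee56343ba8-e457b5a2e4d86bd1-1".
final case class B3Single(traceId: String, spanId: String, sampled: Option[Boolean])

object B3Single {
  def parse(header: String): Option[B3Single] =
    header.split("-", -1) match {
      case Array(traceId, spanId)       => Some(B3Single(traceId, spanId, None))
      case Array(traceId, spanId, flag) => Some(B3Single(traceId, spanId, Some(flag == "1")))
      case _                            => None
    }

  def write(b3: B3Single): String =
    (Seq(b3.traceId, b3.spanId) ++ b3.sampled.map(s => if (s) "1" else "0")).mkString("-")
}
```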
Well, we need to check all the projects and remove outdated ones.
We declare this in the readme:
However, keep in mind that we don't accept mammoth's shit. Only active and interesting projects with good documentation are added. Dead and abandoned projects will be removed.
But, sadly, at this moment this project looks like mammoth shit itself.
We really need to fix it. I think
A form page to help build download URLs for this API: https://lichess.org/api#operation/apiGamesUser
I think something like this already exists in the wild, but it would be nice to make it part of lichess.
Use the existing form CSS, like on https://lichess.org/games/search or https://lichess.org/tournament/new
The form is never submitted; instead it generates download URLs on the client side wit
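A hedged sketch of the client-side idea: assemble a download URL from whatever the form currently holds. The parameter names in the usage comment are illustrative, and this is not how the actual lichess front end is written:

```scala
import java.net.URLEncoder

// Illustrative only: build a download URL for
// https://lichess.org/api/games/user/{username} from form field values.
object GameExportUrl {
  def build(username: String, params: Map[String, String]): String = {
    val base  = s"https://lichess.org/api/games/user/${URLEncoder.encode(username, "UTF-8")}"
    val query = params.collect {
      case (k, v) if v.nonEmpty => s"$k=${URLEncoder.encode(v, "UTF-8")}"
    }.mkString("&")
    if (query.isEmpty) base else s"$base?$query"
  }
}

// e.g. GameExportUrl.build("someUser", Map("max" -> "100", "rated" -> "true"))
```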
Small omission in the guide: it is implied in step 9 that a {} literal should be parsed as a hash-map in the reader, but this is never explicitly stated earlier on. The sentence in question is: "This is basically the functional form of the {} reader literal syntax".
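For readers following the guide, a minimal sketch of what "parse {} as a hash-map in the reader" amounts to; the AST types and function name below are assumptions for illustration, not the guide's actual code:

```scala
// Illustrative only: pair up the forms read between '{' and '}' into a hash-map,
// i.e. build explicitly what the {} reader literal denotes.
sealed trait Form
final case class Str(value: String) extends Form
final case class Num(value: Long) extends Form
final case class HashMapForm(entries: Map[Form, Form]) extends Form

def makeHashMap(forms: List[Form]): HashMapForm = {
  require(forms.length % 2 == 0, "hash-map literal needs an even number of forms")
  HashMapForm(forms.grouped(2).collect { case List(k, v) => k -> v }.toMap)
}

// Reading {"a" 1 "b" 2} would end up calling:
// makeHashMap(List(Str("a"), Num(1), Str("b"), Num(2)))
```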
How do I specify Java 8 when submitting an application with spark-submit?
name: Bug report/Feature request/Question
about: Create a report to help us improve
title: ''
label: bug/enhancement/question
assignees: ''
Environment:
- Java version:
- Scala version:
- Spark version:
- PyTorch and Python version:
- OS and version:
Checklist:
- Did you check if your bug/feature/
Show[Throwable]
I just came across the fact that a Show[Throwable] exists.
- It is not wired up into import Scalaz._; one needs to import scalaz.std.java.throwable._
- It discards the stack trace entirely.
What's going on with this? :D
Is it OK if I fix both of these (for 7.2 and 7.3)?
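For anyone following along, a small usage sketch of the instance in question, assuming scalaz 7.2.x (the exception here is made up):

```scala
import scalaz.Show
import scalaz.std.java.throwable._ // the extra import mentioned above; not part of import Scalaz._

object ShowThrowableDemo extends App {
  val boom: Throwable = new RuntimeException("boom")
  // Renders the throwable via the Show instance; as the report notes,
  // the current instance drops the stack trace entirely.
  println(Show[Throwable].shows(boom))
}
```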
steps
- Open https://www.scala-sbt.org/1.x/api/sbt/util/PlainInput.html
- Click on source link "Input.scala"
problem
- Link returns a 404
expectation
- Link goes to the source on GitHub
notes
In http://dotty.epfl.ch/docs/reference/contextual/relationship-implicits.html#given-instances the documentation says "Given instances without parameters are mapped to implicit objects. E.g., ...". However, doing the translation manually shows that given instances are not implicit objects: implicit objects behave differently with respect to the specificity rule.
- Givens produce an ambiguity error.
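A hedged sketch of the manual translation being compared; the Ord trait and instances are invented for illustration, and whether the two forms really resolve the same way is exactly what this report questions:

```scala
// Illustrative only. A parameterless given instance in Scala 3:
trait Ord[T] { def compare(x: T, y: T): Int }

given intOrd: Ord[Int] with
  def compare(x: Int, y: Int): Int = Integer.compare(x, y)

// The documentation's claimed equivalent, written out by hand (Scala 2 style):
// implicit object intOrd extends Ord[Int] {
//   def compare(x: Int, y: Int): Int = Integer.compare(x, y)
// }
//
// One visible difference: the implicit object has the singleton type intOrd.type,
// while the given's declared type is Ord[Int], which can matter for the
// specificity rule during implicit resolution.
```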
Per the BufferedInputStream documentation, the skip(n) method corresponds to the general contract in InputStream, which is:
"If n is negative, the skip method for class InputStream always returns 0, and no bytes are skipped. Subclasses may handle the negative value differently."
From this I'd expect Scala Native's BufferedInputStream.skip to return 0. Instead an IllegalArgumentEx
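For comparison, a small JVM-side check of the contract quoted above (the byte values are arbitrary):

```scala
import java.io.{BufferedInputStream, ByteArrayInputStream}

object NegativeSkipCheck extends App {
  val in = new BufferedInputStream(new ByteArrayInputStream(Array[Byte](1, 2, 3)))
  // On the JVM this prints 0: a negative argument skips nothing,
  // matching the InputStream contract quoted above. The report is that
  // Scala Native throws instead of returning 0.
  println(in.skip(-5))
}
```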
Plotting UI revamp
The plotting UI leaves a lot to be desired:
- When you bring up the plotting UI, it's not obvious that you have to do some work to make a plot happen. If you don't, there's no error message, just a never-ending wait. (see https://gitter.im/polynote/polynote?at=5e0e36f9eac8d1511e9ed2ff )
- Why do we make you drag things onto axes? Especially when there is only one axis it could go onto (at leas
Spark 2.3 officially supports running on Kubernetes, while our "Run on Kubernetes" guide is still based on a special version of Spark 2.2, which is out of date. We need to:
- update that document to Spark 2.3
- release the corresponding Docker images.
Hi all!
Just want to share with the team some details I've been experiencing while executing notebooks from the command line using a YAML file.
First, let me show my case. I've been parametrizing different notebooks to isolate data wrangling processes. To do it, I needed to use lists of dictionaries to specify keys describing my data, such as area or paths where some files were stored. As all
add microsite?
According to the generated build, the commands to launch are the following:
docker pull andypetrella/spark-notebook:0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2-with-hive
docker run -p 9001:9001 andypetrella/spark-notebook:0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2-with-hive
Using that image (and I think it i
I've noticed all the links on template homepages are broken.
I can't find where these links are set, though. It doesn't seem to be here