
One way that has been suggested to deal with double definitions of overloaded methods is to replace overloading with pattern matching:

object Bar {
   def foo(xs: Any*) = xs foreach { 
      case _:String => println("str")
      case _:Int => println("int")
      case _ => throw new UglyRuntimeException()
   }
}

This approach requires that we surrender static type checking on the arguments to foo. It would be much nicer to be able to write

object Bar {
   def foo(xs: (String or Int)*) = xs foreach {
      case _: String => println("str")
      case _: Int => println("int")
   }
}

I can get close with Either, but it gets ugly fast with more than two types:

type or[L,R] = Either[L,R]

implicit def l2Or[L,R](l: L): L or R = Left(l)
implicit def r2Or[L,R](r: R): L or R = Right(r)

object Bar {
   def foo(xs: (String or Int)*) = xs foreach {
      case Left(l) => println("str")
      case Right(r) => println("int")
   }
}

It looks like a general (elegant, efficient) solution would require defining Either3, Either4, .... Does anyone know of an alternate solution to achieve the same end? To my knowledge, Scala does not have built-in "type disjunction". Also, are the implicit conversions defined above lurking in the standard library somewhere so that I can just import them?
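To make that concrete, a three-type version would presumably look something like this (just a sketch; nothing like it exists in the standard library):

sealed trait Either3[+A, +B, +C]
case class Left3[A](value: A)  extends Either3[A, Nothing, Nothing]
case class Mid3[B](value: B)   extends Either3[Nothing, B, Nothing]
case class Right3[C](value: C) extends Either3[Nothing, Nothing, C]

and so on for every arity, along with the corresponding implicit conversions and pattern matches.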

 Answers

43

Well, in the specific case of Any*, this trick below won't work, as it will not accept mixed types. However, since mixed types wouldn't work with overloading either, this may be what you want.

First, declare a class with the types you wish to accept as below:

class StringOrInt[T]
object StringOrInt {
  implicit object IntWitness extends StringOrInt[Int]
  implicit object StringWitness extends StringOrInt[String]
}

Next, declare foo like this:

object Bar {
  def foo[T: StringOrInt](x: T) = x match {
    case _: String => println("str")
    case _: Int => println("int")
  }
}

And that's it. You can call foo(5) or foo("abc"), and it will work, but try foo(true) and it will fail. This could be side-stepped by client code creating a StringOrInt[Boolean], unless, as noted by Randall below, you make StringOrInt a sealed class.

It works because T: StringOrInt is a context bound: it means foo takes an implicit parameter of type StringOrInt[T], and because Scala looks inside the companion object of a type to see if there are implicits there that make code asking for that type work.
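A minimal sketch of that sealed variant, reusing the names above (making the class abstract as well keeps clients from instantiating it directly):

sealed abstract class StringOrInt[T]  // sealed: can only be extended in this source file
object StringOrInt {
  implicit object IntWitness extends StringOrInt[Int]
  implicit object StringWitness extends StringOrInt[String]
}

Now client code can neither subclass StringOrInt nor write new StringOrInt[Boolean], so foo really does accept only String and Int.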

Tuesday, June 1, 2021
 
Sabya
answered 7 Months ago
84

There is no specific type in TypeScript that corresponds to your desired structure. String index signatures must apply to every property, even the manually declared ones like id. What you're looking for is something like a "rest index signature" or a "default property type", and there is an open suggestion on GitHub asking for this: microsoft/TypeScript#17867. A while ago there was some work done that would have enabled this, but it was shelved (see this comment for more info). So it's not clear when or if this will happen.


You could widen the type of the index signature property so it includes the hardcoded properties via a union, like

type WidenedT = {
    id: number;
    [key: string]: string | number
}

but then you'd have to test every dynamic property before you could treat it as a string:

function processWidenedT(t: WidenedT) {
    t.id.toFixed(); // okay
    t.random.toUpperCase(); // error
    if (typeof t.random === "string") t.random.toUpperCase(); // okay
}

The best way to proceed here would be if you could refactor your JavaScript so that it doesn't "mix" the string-valued bag of properties with a number-valued id. For example:

type RefactoredT = {
    id: number;
    props: { [k: string]: string };
}

Here id and props are completely separate and you don't have to do any complicated type logic to figure out whether your properties are number or string valued. But this would require a bunch of changes to your existing JavaScript and might not be feasible.

From here on out I'll assume you can't refactor your JavaScript. But notice how clean the above is compared to the messy stuff that's coming up:


One common workaround to the lack of rest index signatures is to use an intersection type to get around the constraint that index signatures must apply to every property:

type IntersectionT = {
    id: number;
} & { [k: string]: string };

It sort of kind of works; when given a value of type IntersectionT, the compiler sees the id property as a number and any other property as a string:

function processT(t: IntersectionT) {
    t.id.toFixed(); // okay
    t.random.toUpperCase(); // okay
    t.id = 1; // okay
    t.random = "hello"; // okay
}

But unfortunately you can't assign an object literal to that type without the compiler complaining:

t = { id: 1, random: "hello" }; // error!
// Property 'id' is incompatible with index signature.

You have to work around that further by doing something like Object.assign():

const propBag: { [k: string]: string } = { random: "" };
t = Object.assign({ id: 1 }, propBag);

But this is annoying, since most users will never think to synthesize an object in such a roundabout way.


A different approach is to use a generic type to represent your type instead of a specific type. Think of writing a type checker that takes as input a candidate type, and returns something compatible if and only if that candidate type matches your desired structure:

type VerifyT<T> = { id: number } & { [K in keyof T]: K extends "id" ? unknown : string };

This will require a generic helper function so you can infer the generic T type, like this:

const asT = <T extends VerifyT<T>>(t: T) => t;

Now the compiler will allow you to use object literals and it will check them the way you expect:

asT({ id: 1, random: "hello" }); // okay
asT({ id: "hello" }); // error! string is not number
asT({ id: 1, random: 2 }); // error!  number is not string
asT({ id: 1, random: "", thing: "", thang: "" }); // okay

It's a little harder to read a value of this type with unknown keys, though. The id property is fine, but other properties will not be known to exist, and you'll get an error:

function processT2<T extends VerifyT<T>>(t: T) {
    t.id.toFixed(); // okay
    t.random.toUpperCase(); // error! random not known to be a property
}

Finally, you can use a hybrid approach that combines the best aspects of the intersection and generic types. Use the generic type to create values, and the intersection type to read them:

function processT3<T extends VerifyT<T>>(t: T): void;
function processT3(t: IntersectionT): void {
    t.id.toFixed();
    if ("random" in t)
        t.random.toUpperCase(); // okay
}
processT3({ id: 1, random: "hello" });

The above is an overloaded function, where callers see the generic type, but the implementation sees the intersection type.


Playground link to code

Tuesday, June 1, 2021
 
TecHunter
answered 7 Months ago
14

You can do something like this:

type Key = `data-element-${1|2|3|4|5|6|7|8|9|0}`

const obj:Record<Key, string> = {
    'data-element-0': 'something',
    'data-element-1': 'something else',
    'data-element-2': 'something as well',
    'data-element-3': 'something to feel included',
    'data-element-yu': 'something to feel included', // error
};

UPDATE: I have also made helpers for double-digit numbers, from 0 to 99:

type NonZeroDigit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9'

type NumberHelper = {
  [P in NonZeroDigit]: {
    [Z in NonZeroDigit]: `${P}${Z}`
  }
}

type NestedValues<T extends Record<string, Record<string, string>>> = {
  [P in keyof T]: P extends string ? Values<T[P]> : never
}
type Values<T> = T[keyof T]

type RemoveTrailingZero<T extends string> = T extends `${infer Fst}${infer Snd}` ? Fst extends `0` ? `${Snd}` : `${Fst}${Snd}` : never;

type Numbers_99 = RemoveTrailingZero<Values<NestedValues<NumberHelper>>>

UPDATE

Here is a utility for generating a number range from 1 to 99999:

type Values<T> = T[keyof T]

type LiteralDigits = 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
type NumberString<T extends number> = `${T}`

type AppendDigit<T extends number | string> = `${T}${LiteralDigits}`

type MakeSet<T extends number> = {
    [P in T]: AppendDigit<P>
}

type RemoveTrailingZero<T extends string> = T extends `${infer Fst}${infer Rest}` ? Fst extends `0` ? RemoveTrailingZero<`${Rest}`> : `${Fst}${Rest}` : never;

type From_1_to_999 = RemoveTrailingZero<Values<{
    [P in Values<MakeSet<LiteralDigits>>]: AppendDigit<P>
}>>

type By<V extends NumberString<number>> = RemoveTrailingZero<Values<{
    [P in V]: AppendDigit<P>
}>>

type From_1_to_99999 =
    | From_1_to_999
    | By<From_1_to_999>
    | By<From_1_to_999
        | By<From_1_to_999>>

Demo

UPDATE 3

If you still want to generate literal numbers, not string numbers, you can use this code, which has been honestly stolen from here

type PrependNextNum<A extends Array<unknown>> = A['length'] extends infer T ? ((t: T, ...a: A) => void) extends ((...x: infer X) => void) ? X : never : never;

type EnumerateInternal<A extends Array<unknown>, N extends number> = { 0: A, 1: EnumerateInternal<PrependNextNum<A>, N> }[N extends A['length'] ? 0 : 1];

type Enumerate<N extends number> = EnumerateInternal<[], N> extends (infer E)[] ? E : never;

type Result = Enumerate<43> // 0 | 1 | 2 | ... | 42
Saturday, July 31, 2021
 
Manju
answered 4 Months ago
76

The performance problem has nothing to do with the way the data is read. It is already buffered. Nothing happens until you actually iterate through the lines:

// measures time taken by enclosed code
def timed[A](block: => A) = {
  val t0 = System.currentTimeMillis
  val result = block
  println("took " + (System.currentTimeMillis - t0) + "ms")
  result
}

val source = timed(scala.io.Source.fromFile("test.txt")) // 200mb, 500 lines
// took 0ms

val lines = timed(source.getLines)
// took 0ms

timed(lines.next) // read first line
// took 1ms

// ... reset source ...

var x = 0
timed(lines.foreach(ln => x += ln.length)) // "use" every line
// took 421ms

// ... reset source ...

timed(lines.toArray)
// took 915ms

Considering a read speed of 500 MB per second for my hard drive, the optimum time would be about 400 ms for the 200 MB, which means there is no room for improvement other than not converting the iterator to an array.

Depending on your application, you could consider using the iterator directly instead of an Array, since working with such a huge array in memory is going to be a performance issue anyway.
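For example, a total over line lengths can be computed in one pass over the iterator, without ever materializing the array (a sketch, using the same hypothetical test.txt as above):

val source = scala.io.Source.fromFile("test.txt")
try {
  // fold over the iterator; only the current line is held in memory
  val totalLength = source.getLines.foldLeft(0L)(_ + _.length)
  println(totalLength)
} finally source.close()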


Edit: From your comments I assume that you want to further transform the array (maybe split the lines into columns, as you said you are reading a numeric array). In that case I recommend doing the transformation while reading. For example:

source.getLines.map(_.split(",").map(_.trim.toInt)).toArray

is considerably faster than

source.getLines.toArray.map(_.split(",").map(_.trim.toInt))

(For me it is 1.9s instead of 2.5s) because you don't transform one entire giant array into another, but transform each line individually, ending up with a single array (this uses only half the heap space). Also, since reading the file is the bottleneck, transforming while reading results in better CPU utilization.

Thursday, August 12, 2021
 
medhybrid
answered 4 Months ago
37

It seems to be suggesting that you be consistent in your method call style. Either write everything in infix form:

(t contains true) && (f contains false)

Or everything in regular method call form:

t.contains(true).&&(f.contains(false))
Tuesday, November 9, 2021
 
Jan Kleinert
answered 3 Weeks ago