85 Commits

Author SHA1 Message Date
Motiejus Jakštys
d41111d7ef mem: rename align*Generic to mem.align*
Anecdote 1: The generic version is way more popular than the non-generic
one in the Zig codebase:

     git grep -w alignForward | wc -l
    56
     git grep -w alignForwardGeneric | wc -l
    149

     git grep -w alignBackward | wc -l
    6
     git grep -w alignBackwardGeneric | wc -l
    15

Anecdote 2: In my project (turbonss), which does a lot of arithmetic and
alignment, I exclusively use the Generic functions.

Anecdote 3: We used only the Generic versions in the Macho Man's linker
workshop.
2023-06-17 12:49:13 -07:00
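
A minimal sketch of what the renamed function computes (rounding an address up
to a power-of-two alignment), assuming the post-rename signature that takes the
integer type as its first parameter:

    const std = @import("std");

    test "alignForward rounds up to a power-of-two alignment" {
        // Formerly std.mem.alignForwardGeneric(u64, 13, 8).
        try std.testing.expectEqual(@as(u64, 16), std.mem.alignForward(u64, 13, 8));
        // Already-aligned addresses are returned unchanged.
        try std.testing.expectEqual(@as(u64, 16), std.mem.alignForward(u64, 16, 8));
    }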
r00ster91
2593156068 migration: std.math.{min, min3, max, max3} -> @min & @max 2023-06-16 13:44:09 -07:00
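
A small sketch of the replacement this migration performs (the builtins
peer-resolve their argument types; the three-argument min3/max3 forms can be
expressed by nesting):

    const std = @import("std");

    test "std.math.min/max replaced by builtins" {
        // Before: std.math.min(3, 5), std.math.max(3, 5), std.math.min3(2, 1, 3)
        try std.testing.expectEqual(@as(i32, 3), @min(@as(i32, 3), 5));
        try std.testing.expectEqual(@as(i32, 5), @max(@as(i32, 3), 5));
        try std.testing.expectEqual(@as(i32, 1), @min(@as(i32, 2), @min(1, 3)));
    }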
sentientwaffle
f04e65bc09 std.hash_map: fetchRemove increment available
To avoid leaking slots, `fetchRemove` must increment `available` (when the "fetch" succeeds).

Without the `available += 1`, the added test `"std.hash_map repeat fetchRemove"` fails with

    run test std-x86-linux-none-Debug: error: thread 432734 panic: integer overflow
    .../zig/lib/std/hash_map.zig:1365:28: 0x6471d5 in getOrPutAssumeCapacityAdapted__anon_47495 (test)
            self.available -= 1;
                           ^
    .../zig/lib/std/hash_map.zig:1308:62: 0x616950 in getOrPutAssumeCapacityContext (test)
            const result = self.getOrPutAssumeCapacityAdapted(key, ctx);
                                                             ^

Alternatively, `fetchRemove` could call `removeByIndex`, though that would entail calling `header()` twice instead of once.
2023-06-10 02:24:35 +03:00
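
A minimal sketch (not the commit's exact test) of the put/remove pattern the
added test guards against; without the `available += 1` in `fetchRemove`, the
counter eventually underflows even though only one entry is live at a time:

    const std = @import("std");

    test "repeat put/fetchRemove keeps the capacity accounting balanced" {
        var map = std.AutoHashMap(u32, void).init(std.testing.allocator);
        defer map.deinit();
        try map.ensureTotalCapacity(8);
        var i: u32 = 0;
        while (i < 100) : (i += 1) {
            // Each insertion consumes one `available` slot...
            map.putAssumeCapacity(i, {});
            // ...so each successful fetchRemove must give one back.
            _ = map.fetchRemove(i);
        }
    }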
mlugg
4976b58ab1
Prevent analysis of functions only referenced at comptime
The idea here is that there are two ways we can reference a function at runtime:

* Through a direct call, i.e. where the function is comptime-known
* Through a function pointer

This means we can easily perform a form of rudimentary escape analysis
on functions. If we ever see a `decl_ref` or `ref` of a function, we
have a function pointer, which could "leak" into runtime code, so we
emit the function; but for a plain `decl_val`, there's no need to.

This change means that `comptime { _ = f; }` no longer forces a function
to be emitted, a pattern a few places (mainly tests) relied on. These use
sites have been replaced with `_ = &f;`, which still triggers analysis
of the function body, since you're taking a pointer to the function.

Resolves: #6256
Resolves: #15353
2023-05-29 23:06:08 +01:00
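
A small sketch of the before/after idiom described above:

    fn helper() void {}

    comptime {
        // Only a `decl_val` reference: under the new rule this no longer forces
        // `helper` to be analyzed and emitted.
        _ = helper;
    }

    comptime {
        // Taking the address creates a function pointer that could escape into
        // runtime code, so this still triggers analysis of the function body.
        _ = &helper;
    }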
Dominic
5a3eca5d4c
Disallow named test decls with duplicate names 2023-05-08 10:59:06 +03:00
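
For illustration, a pair of test declarations that this change now rejects
(the second declaration reuses the first one's name):

    test "duplicate name" {}
    test "duplicate name" {} // compile error after this change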
Linus Groh
94e30a756e std: fix a bunch of typos
The majority of these are in comments, some in doc comments (which might
affect the generated documentation), and a few in parameter names;
nothing that should be breaking, however.
2023-04-30 18:16:04 -07:00
Andrew Kelley
401b7f6f53 zig fmt 2023-04-25 11:23:41 -07:00
Andrew Kelley
a5c910adb6 change semantics of @memcpy and @memset
Now they use slices or array pointers with any element type instead of
requiring byte pointers.

This is a breaking enhancement to the language.

The safety check for overlapping pointers will be implemented in a
future commit.

closes #14040
2023-04-25 11:23:40 -07:00
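
A minimal sketch of the new semantics, using slices and array pointers of a
non-byte element type:

    const std = @import("std");

    test "element-typed @memcpy and @memset" {
        const src = [_]u32{ 1, 2, 3, 4 };
        var dst: [4]u32 = undefined;
        // Destination and source lengths must match; no byte pointers, no length argument.
        @memcpy(&dst, &src);
        @memset(dst[0..2], 0);
        try std.testing.expectEqualSlices(u32, &[_]u32{ 0, 0, 3, 4 }, &dst);
    }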
-k
0282c2a924 doc: fix minor grammar issues 2023-04-22 13:09:15 +03:00
Jacob Young
a3529c2dea tools: implement more lldb pretty printers 2023-02-27 05:37:03 -05:00
Jacob Young
1a1598daf0 hash map: remove extra argument 2023-02-21 00:27:12 -05:00
Andrew Kelley
aeaef8c0ff update std lib and compiler sources to new for loop syntax 2023-02-18 19:17:21 -07:00
r00ster91
7350ea3e2d std.builtin: rename Type.Fn's args to params
This was a poor naming choice; these are parameters, not arguments.
Parameters specify what kind of arguments are expected, whereas the arguments are the actual values passed.
2022-12-17 14:11:33 +01:00
r00ster91
20d3fd901e std.builtin: rename Type.Fn.Param's arg_type to type
It's the type of a parameter, not an argument, but the prefix is redundant either way.
2022-12-17 14:11:33 +01:00
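
A small sketch of the renamed fields as seen through `@typeInfo`, assuming the
`.Fn` tag spelling contemporary with these commits:

    const std = @import("std");

    fn add(a: i32, b: i32) i32 {
        return a + b;
    }

    test "function parameter reflection after the renames" {
        const info = @typeInfo(@TypeOf(add)).Fn;
        // `params` was previously `args`; each parameter's `type` was `arg_type`.
        try std.testing.expectEqual(@as(usize, 2), info.params.len);
        try std.testing.expect(info.params[0].type.? == i32);
    }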
Andrew Kelley
954019983d std: add move() functions to hash maps 2022-12-04 15:57:40 -07:00
Andrew Kelley
d3d24874c9 std: remove deprecated API for the upcoming release
See #3811
2022-09-16 14:46:53 -04:00
Andrew Kelley
4f527e5d36 std: fix missing hash map safety
There was a missing compile error for calling ensureUnusedCapacity
without a Context in the case that the Context is non-void.
2022-04-20 17:18:06 -07:00
Andrew Kelley
b45c6c757c std.hash_map: workaround for circular dependency
See #11367

It's debatable whether this ends up being a legitimate compile error or
whether the lang spec allows this test case. For now this workaround
seems very reasonable: delay comptime execution of `verifyContext`
until the struct is instantiated.
2022-04-01 00:17:02 -07:00
Robin Voetter
feb8981a95 gdb: add slice, multi array list, and array hash map printers 2022-03-16 15:50:03 +01:00
Veikka Tuominen
2682b41da5 make gpa.deinit work with stage2 2022-02-28 13:09:14 -07:00
Motiejus Jakštys
c03b733f09
std.HashMap: return explicit errors (#11000)
All errors from std.HashMap are allocation errors. Mark them as
such. This is helpful when one wants to return explicit errors where
HashMap is used.
2022-02-27 15:24:00 -05:00
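
A small sketch of why the explicit error set is convenient: a hypothetical
wrapper can declare `std.mem.Allocator.Error` instead of relying on an
inferred error set.

    const std = @import("std");

    // Hypothetical helper: all of the map's failure modes are allocation failures.
    fn insertDefault(map: *std.StringHashMap(u32), name: []const u8) std.mem.Allocator.Error!void {
        try map.put(name, 0);
    }

    test "explicit allocation error set" {
        var map = std.StringHashMap(u32).init(std.testing.allocator);
        defer map.deinit();
        try insertDefault(&map, "answer");
        try std.testing.expectEqual(@as(?u32, 0), map.get("answer"));
    }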
chip2n
ab43c045ed Fix incorrect documentation for std.HashMap.remove() 2022-02-18 13:47:38 -05:00
Andrew Kelley
cf88cf2657 std: make ArrayHashMap eql function accept an additional param
which is the index of the key that already exists in the hash map.

This enables using `AutoArrayHashMap(void, void)`, which may seem
surprising at first but is actually pretty handy!
This commit includes a proof-of-concept of how I want to use it, with a
new InternArena abstraction for stage2 that provides a compact way to
store values (and types) in an "internment arena". Each type is stored
exactly once (per arena), is representable with a single u32 referring
to it within an InternArena, and is comparable with a simple u32 integer
comparison: if both types are in the same InternArena, you can check
whether they are equal by checking whether their indexes are the same.

What's neat about `AutoArrayHashMap(void, void)` is that it allows us to
look up the indexes by key, *without actually storing the keys*.
Instead, keys are treated as ephemeral values that are constructed as
needed.

As a result, we have an extremely efficient encoding of types and
values, represented by only three arrays, which contain no pointers and
can therefore be serialized and deserialized by a single writev/readv
call.
The `map` field is denormalized data and can be computed from the other
two fields.

This is in contrast to our current Type/Value system which makes
extensive use of pointers.

The test at the bottom of InternArena.zig passes in this commit.
2022-01-31 01:20:45 -07:00
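
A toy sketch of the `AutoArrayHashMap(void, void)` trick described above (the
`StringPool`, `Adapter`, and `intern` names are illustrative, and
`std.array_hash_map.hashString` is assumed to be the usual string-hash helper):
keys are never stored in the map; an adapter recomputes hash/eql from a side
array using the index that `eql` now receives.

    const std = @import("std");

    const StringPool = struct {
        strings: std.ArrayListUnmanaged([]const u8) = .{},
        map: std.AutoArrayHashMapUnmanaged(void, void) = .{},

        const Adapter = struct {
            pool: *const StringPool,

            pub fn hash(_: @This(), key: []const u8) u32 {
                return std.array_hash_map.hashString(key);
            }
            pub fn eql(self: @This(), key: []const u8, _: void, index: usize) bool {
                // The stored "key" is void; the real data lives in the side array.
                return std.mem.eql(u8, key, self.pool.strings.items[index]);
            }
        };

        /// Equal strings always intern to the same stable index.
        fn intern(self: *StringPool, gpa: std.mem.Allocator, str: []const u8) !usize {
            const gop = try self.map.getOrPutAdapted(gpa, str, Adapter{ .pool = self });
            if (!gop.found_existing) try self.strings.append(gpa, str);
            return gop.index;
        }
    };

Serializing such a pool only needs the side array; the map is denormalized data
that can be rebuilt, mirroring the note about the `map` field above.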
riverbl
a0732117d0 HashMap: add removeByPtr 2022-01-24 20:29:05 +02:00
djg
4731a6e5d5
std: hash_map: optimize isFree/isTombstone (#10562)
- Add a `Metadata.isFree` helper method.
- Implement `Metadata.isTombstone` and `Metadata.isFree` with `@bitCast` followed by a comparison to a constant. I assume `@bitCast`-then-compare is faster than the old method because it involves only one comparison and doesn't require bitmasking.
- Summary of benchmarked changes (`gotta-go-fast`, run locally, compared to master):
  - 3/4 of the hash map benchmarks used ~10% fewer cycles.
  - The last one (Project Euler) used 4% fewer cycles.
2022-01-10 23:54:45 -05:00
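
A sketch of the metadata trick, assuming the 8-bit slot metadata layout used by
`std.hash_map` at the time (a 7-bit fingerprint plus a `used` bit) and the
post-0.11 builtin call syntax:

    const Metadata = packed struct {
        fingerprint: u7 = 0,
        used: u1 = 0,

        const free_byte: u8 = 0x00; // used == 0, fingerprint == 0
        const tombstone_byte: u8 = 0x01; // used == 0, fingerprint == 1

        // One 8-bit comparison, no masking of the fingerprint bits.
        fn isFree(self: Metadata) bool {
            return @as(u8, @bitCast(self)) == free_byte;
        }
        fn isTombstone(self: Metadata) bool {
            return @as(u8, @bitCast(self)) == tombstone_byte;
        }
    };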
Andrew Kelley
6d04de706a Revert "std: optimize hash_map probe loop condition"
This reverts commit 11803a3a569205d640c7ec0b0aedba83f47a6e64.

Observations from the performance dashboard:
 * strictly worse in terms of CPU instructions
 * slightly worse wall time (but this can be noisy)
 * sometimes better, sometimes worse for branch predictions

Given that the commit was introducing complexity for optimization's
sake, these performance changes do not seem worth it.
2021-12-17 16:56:43 -07:00
sentientwaffle
11803a3a56 std: optimize hash_map probe loop condition
See https://github.com/ziglang/zig/pull/10337 for context.

In #10337 the `available` tracking fix necessitated an additional condition on the probe loop in both `getOrPut` and `getIndex` to prevent an infinite loop. Previously, this condition was implicit thanks to the guaranteed presence of a free slot.

The new condition hurts the `HashMap` benchmarks (https://github.com/ziglang/zig/pull/10337#issuecomment-996432758).

This commit removes that extra condition on the loop. Instead, when probing, first check whether the "home" slot is the target key — if so, return it. Otherwise, save the home slot's metadata to the stack and temporarily "free" the slot (but don't touch its value). Then continue with the original loop. Once again, the loop will be implicitly broken by the new "free" slot. The original metadata is restored before the function returns.

`getOrPut` has one additional gotcha: if the home slot is a tombstone and `getOrPut` misses, then the home slot is written with the new key; that is, its original metadata (the tombstone) is not restored.

Other changes:

- Test hash map misses.
- Test using `getOrPutAssumeCapacity` to get keys at the end (along with `get`).
2021-12-17 15:21:41 -08:00
sentientwaffle
ef0566df78 std: count hash_map tombstones as available
When entries are inserted into and removed from a hash map at roughly equal rates (maintaining a mostly-consistent total count of entries), it should never need to be resized. But `HashMapUnmanaged.available` does not presently count tombstoned slots as "available", so this put/remove pattern eventually panics (assertion failure) when `available` reaches `0`.

The solution implemented here is to count tombstoned slots as "available". Another approach, which hashbrown takes (b3eaf32e60, src/raw/mod.rs, L1455-L1542), would be to rehash all entries in place when there are too many tombstones. This is more complex but avoids an `O(n)` bad case when the hash map is full of tombstones.
2021-12-16 19:11:53 -08:00
Lee Cannon
85de022c56
allocgate: std Allocator interface refactor 2021-11-30 23:32:47 +00:00
Andrew Kelley
902df103c6 std lib API deprecations for the upcoming 0.9.0 release
See #3811
2021-11-30 00:13:07 -07:00
Ominitay
c1a5ff34f3 std.rand: Refactor Random interface
These changes have been made to resolve issue #10037. The `Random`
interface was implemented in such a way that it caused a significant
slowdown when calling the `fill` function of the underlying RNG.

The `Random` interface is no longer stored in a field of the rng, and is
instead returned by the child function `random()` of the rng. This
avoids the performance issues caused by the interface.
2021-10-27 16:07:48 -04:00
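
A minimal sketch of the new calling convention, assuming the `std.rand` names
of that era:

    const std = @import("std");

    test "obtain the Random interface from the generator" {
        var prng = std.rand.DefaultPrng.init(42);
        // The interface is no longer a stored field; it is produced on demand,
        // which avoids the slowdown described above when calling `fill`.
        const random = prng.random();
        _ = random.int(u32);
    }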
Ryan Liptak
59f5053bed Update all ensureCapacity calls to the relevant non-deprecated version 2021-09-19 13:52:56 +02:00
FnControlOption
fdf1918b39 std.hash_map: add StringIndexAdapter and StringIndexContext 2021-09-03 06:50:27 -07:00
Andrew Kelley
c05a20fc8c std: reorganization that allows new usingnamespace semantics
Proposal #9629 is now accepted: usingnamespace stays but no longer
puts identifiers in scope.
2021-09-01 17:54:06 -07:00
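
A small sketch of the accepted semantics, for illustration: the mixed-in
declaration is re-exported through the container but is no longer in scope as
an unqualified name.

    const std = @import("std");

    const extras = struct {
        pub const answer: u32 = 42;
    };

    pub usingnamespace extras;

    test "usingnamespace re-exports without injecting into scope" {
        // Unqualified `answer` would not resolve here; it is reachable only
        // through a namespace, e.g. this file's own container type.
        try std.testing.expectEqual(@as(u32, 42), @This().answer);
    }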
fn ⌃ ⌥
b25e58b0ac
std.hash_map: add getKey methods (#9607) 2021-08-31 00:32:34 -04:00
Andrew Kelley
d29871977f remove redundant license headers from zig standard library
We already have a LICENSE file that covers the Zig Standard Library. We
no longer need to remind everyone that the license is MIT in every single
file.

Previously this was introduced to clarify the situation for a fork of
Zig that made Zig's LICENSE file harder to find and replaced it with
their own license, which required annual payments to their company.
However, that fork now appears to be dead, so there is no need to
reinforce the copyright notice in every single file.
2021-08-24 12:25:09 -07:00
Andrew Kelley
47f2463b5c std.HashMap: fix getPtrAdapted. AstGen: fix fn param iteration
There was a bug in stage2 regarding iteration of function parameter AST.
This resulted in a false negative "unused parameter" compile error,
which, when fixed, revealed a bug in the std lib HashMap implementation.
2021-08-05 23:17:29 -07:00
Isaac Freund
03156e5899 std/hash_map: fix ensureUnusedCapacity() over-allocating
Currently this function adds the desired unused capacity to the current
total capacity instead of to the number of entries currently in use.
2021-07-12 11:31:36 -07:00
Andrew Kelley
298a65ff4b std.HashMap: add ensureUnusedCapacity and ensureTotalCapacity
and deprecate ensureCapacity. This matches the pattern already set by
ArrayList and ArrayHashMap.
2021-07-07 00:38:10 -07:00
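
A small usage sketch of the two reservation calls, which also illustrates the
distinction the over-allocation fix above depends on (total versus unused
capacity):

    const std = @import("std");

    test "reserve capacity, then insert without further allocation" {
        var map = std.AutoHashMap(u32, u32).init(std.testing.allocator);
        defer map.deinit();
        // ensureTotalCapacity reserves an absolute number of entries;
        // ensureUnusedCapacity reserves room on top of what is already stored.
        try map.ensureTotalCapacity(8);
        try map.ensureUnusedCapacity(2);
        map.putAssumeCapacity(1, 10);
        map.putAssumeCapacity(2, 20);
    }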
Jacob G-W
9fffffb07b fix code broken from previous commit 2021-06-21 17:03:03 -07:00
Jacob G-W
641ecc260f std, src, doc, test: remove unused variables 2021-06-21 17:03:03 -07:00
hadroncfy
1f29b75f08
HashMap.getOrPutAssumeCapacityAdapted should set key to undefined (#9138)
* std.hash_map.HashMap: getOrPutAssumeCapacityAdapted should set key to undefined

* add test for std.hash_map.HashMap.getOrPutAdapted
2021-06-18 08:52:30 +03:00
Andrew Kelley
138afd5cbf zig fmt 2021-06-10 20:13:43 -07:00
Martin Wickham
6953c8544b Fix return type of HashMap.getAdapted 2021-06-03 17:58:06 -05:00
Martin Wickham
fc9430f567 Breaking hash map changes for 0.8.0
- hash/eql functions moved into a Context object
- *Context functions pass an explicit context
- *Adapted functions pass specialized keys and contexts
- new getPtr() function returns a pointer to value
- remove functions renamed to fetchRemove
- new remove functions return bool
- removeAssertDiscard deleted, use assert(remove(...)) instead
- Keys and values are stored in separate arrays
- Entry is now {*K, *V}, the new KV is {K, V}
- BufSet/BufMap functions renamed to match other set/map types
- fixed iterating-while-modifying bug in src/link/C.zig
2021-06-03 17:02:16 -05:00
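
A brief sketch of the reworked API surface listed above (getPtr, fetchRemove,
and the bool-returning remove):

    const std = @import("std");

    test "0.8.0-style hash map calls" {
        var map = std.AutoHashMap(u32, []const u8).init(std.testing.allocator);
        defer map.deinit();
        try map.put(1, "one");

        // getPtr returns a pointer to the stored value.
        if (map.getPtr(1)) |value_ptr| value_ptr.* = "uno";

        // The old fetch-style removal is now fetchRemove; remove returns a bool.
        const kv = map.fetchRemove(1).?;
        try std.testing.expectEqualStrings("uno", kv.value);
        try std.testing.expect(!map.remove(1));
    }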
Andrew Kelley
597082adf4 Merge remote-tracking branch 'origin/master' into stage2-whole-file-astgen
Conflicts:
 * build.zig
 * src/Compilation.zig
 * src/codegen/spirv/spec.zig
 * src/link/SpirV.zig
 * test/stage2/darwin.zig
   - this one might be problematic; start.zig looks for `main` in the
     root source file, not `_main`. Not sure why there is an underscore
     there in the master branch.
2021-05-15 21:44:38 -07:00
Sahnvour
2d4d4baa42 std.hash_map: use 7 bits of metadata instead of 6
We effectively need only 1 control bit to represent the 2 special states
of the metadata (free and tombstone).
This should reduce the number of actual element equality tests, but since
that number is already very low, the impact is negligible.
2021-05-14 15:15:55 -04:00
Andrew Kelley
5619ce2406 Merge remote-tracking branch 'origin/master' into stage2-whole-file-astgen
Conflicts:
 * doc/langref.html.in
 * lib/std/enums.zig
 * lib/std/fmt.zig
 * lib/std/hash/auto_hash.zig
 * lib/std/math.zig
 * lib/std/mem.zig
 * lib/std/meta.zig
 * test/behavior/alignof.zig
 * test/behavior/bitcast.zig
 * test/behavior/bugs/1421.zig
 * test/behavior/cast.zig
 * test/behavior/ptrcast.zig
 * test/behavior/type_info.zig
 * test/behavior/vector.zig

Master branch added `try` to a bunch of testing function calls, and some
lines had also changed how the native architecture and other
`@import("builtin")` details are referred to.
2021-05-08 14:45:21 -07:00
Veikka Tuominen
fd77f2cfed std: update usage of std.testing 2021-05-08 15:15:30 +03:00
Andrew Kelley
429cd2b5dd std: change @import("builtin") to std.builtin 2021-04-15 19:06:39 -07:00