alessandro trinca tornidor committed on
Commit a133169
1 Parent(s): ac226c8

test: add more test cases

README.md CHANGED
@@ -48,6 +48,31 @@ I upgraded the old custom frontend (jQuery@3.7.1, Bootstrap@5.3.3) and backend (
 If TTS voices needed by the in-browser SpeechSynthesis Text-to-Speech feature are missing (e.g. on Windows 11 you need to manually install the TTS voices for the languages you need), the Gradio frontend currently raises a JavaScript alert message.
 In this case the in-browser TTS feature isn't usable and users should fall back to the backend TTS feature.
 
+## Python test cases (also enhanced with a mutation test suite)
+
+After reaching a test coverage of 89%, I tried the [`cosmic-ray`](https://cosmic-ray.readthedocs.io/) [mutation testing](https://en.wikipedia.org/wiki/Mutation_testing) suite and found out that I had missed some spots.
+For this reason I started improving my test cases (one module at a time, to avoid waiting too long):
+
+```bash
+python .venv312/bin/cosmic-ray init cosmic_ray_config.toml cosmic_ray.sqlite
+python .venv312/bin/cosmic-ray --verbosity=INFO baseline cosmic_ray_config.toml
+python .venv312/bin/cosmic-ray exec cosmic_ray_config.toml cosmic_ray.sqlite
+cr-html cosmic_ray.sqlite > tmp/cosmic-ray-speechtoscore.html
+```
+
+The `cosmic_ray_config.toml` I'm using now (the tests for the `lambdaSpeechToScore` module are split across two files to avoid having too much code in a single one):
+
+```toml
+[cosmic-ray]
+module-path = "aip_trainer/lambdas/lambdaSpeechToScore.py"
+timeout = 30.0
+excluded-modules = []
+test-command = "python -m pytest tests/lambdas/test_lambdaSpeechToScore.py tests/lambdas/test_lambdaSpeechToScore_librosa.py"
+
+[cosmic-ray.distributor]
+name = "local"
+```
+
 ### E2E tests with playwright
 
 Normally I use Visual Studio Code to write and execute my playwright tests, however it's always possible to run them from the CLI (from the `static` folder, using a node package manager like `npm` or `pnpm`):
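A minimal sketch of such a CLI run, assuming the standard Playwright test runner is configured under `static` (the exact scripts in this repo may differ):

```bash
cd static
# with npm
npx playwright test
# or with pnpm
pnpm exec playwright test
```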
cosmic_ray_config.toml ADDED
@@ -0,0 +1,8 @@
+[cosmic-ray]
+module-path = "aip_trainer/lambdas/lambdaSpeechToScore.py"
+timeout = 30.0
+excluded-modules = []
+test-command = "python -m pytest tests/lambdas/test_lambdaSpeechToScore.py tests/lambdas/test_lambdaSpeechToScore_librosa.py"
+
+[cosmic-ray.distributor]
+name = "local"
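Once `exec` has populated the session database, the results can also be summarized from the CLI; a minimal sketch, assuming the `cr-report` and `cr-rate` entry points that ship with `cosmic-ray`:

```bash
# plain-text summary of each mutation job and whether the tests killed it
cr-report cosmic_ray.sqlite
# overall mutant survival rate, handy as a CI threshold
cr-rate cosmic_ray.sqlite
```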
tests/events/test_float_buffer.json ADDED
@@ -0,0 +1 @@
+ [0.0, -0.47265625, 0.0, -0.472686767578125, 0.0, -0.47271728515625, 0.0, -0.472747802734375, 0.0, -0.4727783203125, 0.0, -0.472808837890625, 0.0, -0.47283935546875, 0.0, -0.472869873046875, 0.0, -0.472900390625, 0.0, -0.472930908203125, 0.0, -0.47296142578125, 0.0, -0.472991943359375, 0.0, -0.4730224609375, 0.0, -0.473052978515625, 0.0, -0.47308349609375, 0.0, -0.473114013671875, 0.0, -0.47314453125, 0.0, -0.473175048828125, 0.0, -0.47320556640625, 0.0, -0.473236083984375, 0.0, -0.4732666015625, 0.0, -0.473297119140625, 0.0, -0.47332763671875, 0.0, -0.473358154296875, 0.0, -0.473388671875, 0.0, -0.473419189453125, 0.0, -0.47344970703125, 0.0, -0.473480224609375, 0.0, -0.4735107421875, 0.0, -0.473541259765625, 0.0, -0.47357177734375, 0.0, -0.473602294921875, 0.0, -0.4736328125, 0.0, -0.473663330078125, 0.0, -0.47369384765625, 0.0, -0.473724365234375, 0.0, -0.4737548828125, 0.0, -0.473785400390625, 0.0, -0.47381591796875, 0.0, -0.473846435546875, 0.0, -0.473876953125, 0.0, -0.473907470703125, 0.0, -0.47393798828125, 0.0, -0.473968505859375, 0.0, -0.4739990234375, 0.0, -0.474029541015625, 0.0, -0.47406005859375, 0.0, -0.474090576171875, 0.0, -0.47412109375, 0.0, -0.474151611328125, 0.0, -0.47418212890625, 0.0, -0.474212646484375, 0.0, -0.4742431640625, 0.0, -0.474273681640625, 0.0, -0.47430419921875, 0.0, -0.474334716796875, 0.0, -0.474365234375, 0.0, -0.474395751953125, 0.0, -0.47442626953125, 0.0, -0.474456787109375, 0.0, -0.4744873046875, 0.0, -0.474517822265625, 0.0, -0.47454833984375, 0.0, -0.474578857421875, 0.0, -0.474609375, 0.0, -0.474639892578125, 0.0, -0.47467041015625, 0.0, -0.474700927734375, 0.0, -0.4747314453125, 0.0, -0.474761962890625, 0.0, -0.47479248046875, 0.0, -0.474822998046875, 0.0, -0.474853515625, 0.0, -0.474884033203125, 0.0, -0.47491455078125, 0.0, -0.474945068359375, 0.0, -0.4749755859375, 0.0, -0.475006103515625, 0.0, -0.47503662109375, 0.0, -0.475067138671875, 0.0, -0.47509765625, 0.0, -0.475128173828125, 0.0, -0.47515869140625, 0.0, -0.475189208984375, 0.0, -0.4752197265625, 0.0, -0.475250244140625, 0.0, -0.47528076171875, 0.0, -0.475311279296875, 0.0, -0.475341796875, 0.0, -0.475372314453125, 0.0, -0.47540283203125, 0.0, -0.475433349609375, 0.0, -0.4754638671875, 0.0, -0.475494384765625, 0.0, -0.47552490234375, 0.0, -0.475555419921875, 0.0, -0.4755859375, 0.0, -0.475616455078125, 0.0, -0.47564697265625, 0.0, -0.475677490234375, 0.0, -0.4757080078125, 0.0, -0.475738525390625, 0.0, -0.47576904296875, 0.0, -0.475799560546875, 0.0, -0.475830078125, 0.0, -0.475860595703125, 0.0, -0.47589111328125, 0.0, -0.475921630859375, 0.0, -0.4759521484375, 0.0, -0.475982666015625, 0.0, -0.47601318359375, 0.0, -0.476043701171875, 0.0, -0.47607421875, 0.0, -0.476104736328125, 0.0, -0.47613525390625, 0.0, -0.476165771484375, 0.0, -0.4761962890625, 0.0, -0.476226806640625, 0.0, -0.47625732421875, 0.0, -0.476287841796875, 0.0, -0.476318359375, 0.0, -0.476348876953125, 0.0, -0.47637939453125, 0.0, -0.476409912109375, 0.0, -0.4764404296875, 0.0, -0.476470947265625, 0.0, -0.47650146484375, 0.0, -0.476531982421875, 0.0, -0.4765625, 0.0, -0.47662353515625, 0.0, -0.4766845703125, 0.0, -0.47674560546875, 0.0, -0.476806640625, 0.0, -0.47686767578125, 0.0, -0.4769287109375, 0.0, -0.47698974609375, 0.0, -0.47705078125, 0.0, -0.47711181640625, 0.0, -0.4771728515625, 0.0, -0.47723388671875, 0.0, -0.477294921875, 0.0, -0.47735595703125, 0.0, -0.4774169921875, 0.0, -0.47747802734375, 0.0, -0.4775390625, 0.0, -0.47760009765625, 0.0, -0.4776611328125, 0.0, -0.47772216796875, 0.0, -0.477783203125, 
0.0, -0.47784423828125, 0.0, -0.4779052734375, 0.0, -0.47796630859375, 0.0, -0.47802734375, 0.0, -0.47808837890625, 0.0, -0.4781494140625, 0.0, -0.47821044921875, 0.0, -0.478271484375, 0.0, -0.47833251953125, 0.0, -0.4783935546875, 0.0, -0.47845458984375, 0.0, -0.478515625, 0.0, -0.47857666015625, 0.0, -0.4786376953125, 0.0, -0.47869873046875, 0.0, -0.478759765625, 0.0, -0.47882080078125, 0.0, -0.4788818359375, 0.0, -0.47894287109375, 0.0, -0.47900390625, 0.0, -0.47906494140625, 0.0, -0.4791259765625, 0.0, -0.47918701171875, 0.0, -0.479248046875, 0.0, -0.47930908203125, 0.0, -0.4793701171875, 0.0, -0.47943115234375, 0.0, -0.4794921875, 0.0, -0.47955322265625, 0.0, -0.4796142578125, 0.0, -0.47967529296875, 0.0, -0.479736328125, 0.0, -0.47979736328125, 0.0, -0.4798583984375, 0.0, -0.47991943359375, 0.0, -0.47998046875, 0.0, -0.48004150390625, 0.0, -0.4801025390625, 0.0, -0.48016357421875, 0.0, -0.480224609375, 0.0, -0.48028564453125, 0.0, -0.4803466796875, 0.0, -0.48040771484375, 0.0, -0.48046875, 0.0, -0.4805908203125, 0.0, -0.480712890625, 0.0, -0.4808349609375, 0.0, -0.48095703125, 0.0, -0.4810791015625, 0.0, -0.481201171875, 0.0, -0.4813232421875, 0.0, -0.4814453125, 0.0, -0.4815673828125, 0.0, -0.481689453125, 0.0, -0.4818115234375, 0.0, -0.48193359375, 0.0, -0.4820556640625, 0.0, -0.482177734375, 0.0, -0.4822998046875, 0.0, -0.482421875, 0.0, -0.4825439453125, 0.0, -0.482666015625, 0.0, -0.4827880859375, 0.0, -0.48291015625, 0.0, -0.4830322265625, 0.0, -0.483154296875, 0.0, -0.4832763671875, 0.0, -0.4833984375, 0.0, -0.4835205078125, 0.0, -0.483642578125, 0.0, -0.4837646484375, 0.0, -0.48388671875, 0.0, -0.4840087890625, 0.0, -0.484130859375, 0.0, -0.4842529296875, 0.0, -0.484375, 0.0, -0.484619140625, 0.0, -0.48486328125, 0.0, -0.485107421875, 0.0, -0.4853515625, 0.0, -0.485595703125, 0.0, -0.48583984375, 0.0, -0.486083984375, 0.0, -0.486328125, 0.0, -0.486572265625, 0.0, -0.48681640625, 0.0, -0.487060546875, 0.0, -0.4873046875, 0.0, -0.487548828125, 0.0, -0.48779296875, 0.0, -0.488037109375, 0.0, -0.48828125, 0.0, -0.48876953125, 0.0, -0.4892578125, 0.0, -0.48974609375, 0.0, -0.490234375, 0.0, -0.49072265625, 0.0, -0.4912109375, 0.0, -0.49169921875, 0.0, -0.4921875, 0.0, -0.4931640625, 0.0, -0.494140625, 0.0, -0.4951171875, 0.0, -0.49609375, 0.0, -0.498046875, 0.0, -0.5, 0.0, -0.50390625, 0.0, 0.0, 0.0, 0.49609375, 0.0, 0.5, 0.0, 0.501953125, 0.0, 0.50390625, 0.0, 0.5048828125, 0.0, 0.505859375, 0.0, 0.5068359375, 0.0, 0.5078125, 0.0, 0.50830078125, 0.0, 0.5087890625, 0.0, 0.50927734375, 0.0, 0.509765625, 0.0, 0.51025390625, 0.0, 0.5107421875, 0.0, 0.51123046875, 0.0, 0.51171875, 0.0, 0.511962890625, 0.0, 0.51220703125, 0.0, 0.512451171875, 0.0, 0.5126953125, 0.0, 0.512939453125, 0.0, 0.51318359375, 0.0, 0.513427734375, 0.0, 0.513671875, 0.0, 0.513916015625, 0.0, 0.51416015625, 0.0, 0.514404296875, 0.0, 0.5146484375, 0.0, 0.514892578125, 0.0, 0.51513671875, 0.0, 0.515380859375, 0.0, 0.515625, 0.0, 0.5157470703125, 0.0, 0.515869140625, 0.0, 0.5159912109375, 0.0, 0.51611328125, 0.0, 0.5162353515625, 0.0, 0.516357421875, 0.0, 0.5164794921875, 0.0, 0.5166015625, 0.0, 0.5167236328125, 0.0, 0.516845703125, 0.0, 0.5169677734375, 0.0, 0.51708984375, 0.0, 0.5172119140625, 0.0, 0.517333984375, 0.0, 0.5174560546875, 0.0, 0.517578125, 0.0, 0.5177001953125, 0.0, 0.517822265625, 0.0, 0.5179443359375, 0.0, 0.51806640625, 0.0, 0.5181884765625, 0.0, 0.518310546875, 0.0, 0.5184326171875, 0.0, 0.5185546875, 0.0, 0.5186767578125, 0.0, 0.518798828125, 0.0, 0.5189208984375, 0.0, 0.51904296875, 0.0, 
0.5191650390625, 0.0, 0.519287109375, 0.0, 0.5194091796875, 0.0, 0.51953125, 0.0, 0.51959228515625, 0.0, 0.5196533203125, 0.0, 0.51971435546875, 0.0, 0.519775390625, 0.0, 0.51983642578125, 0.0, 0.5198974609375, 0.0, 0.51995849609375, 0.0, 0.52001953125, 0.0, 0.52008056640625, 0.0, 0.5201416015625, 0.0, 0.52020263671875, 0.0, 0.520263671875, 0.0, 0.52032470703125, 0.0, 0.5203857421875, 0.0, 0.52044677734375, 0.0, 0.5205078125, 0.0, 0.52056884765625, 0.0, 0.5206298828125, 0.0, 0.52069091796875, 0.0, 0.520751953125, 0.0, 0.52081298828125, 0.0, 0.5208740234375, 0.0, 0.52093505859375, 0.0, 0.52099609375, 0.0, 0.52105712890625, 0.0, 0.5211181640625, 0.0, 0.52117919921875, 0.0, 0.521240234375, 0.0, 0.52130126953125, 0.0, 0.5213623046875, 0.0, 0.52142333984375, 0.0, 0.521484375, 0.0, 0.52154541015625, 0.0, 0.5216064453125, 0.0, 0.52166748046875, 0.0, 0.521728515625, 0.0, 0.52178955078125, 0.0, 0.5218505859375, 0.0, 0.52191162109375, 0.0, 0.52197265625, 0.0, 0.52203369140625, 0.0, 0.5220947265625, 0.0, 0.52215576171875, 0.0, 0.522216796875, 0.0, 0.52227783203125, 0.0, 0.5223388671875, 0.0, 0.52239990234375, 0.0, 0.5224609375, 0.0, 0.52252197265625, 0.0, 0.5225830078125, 0.0, 0.52264404296875, 0.0, 0.522705078125, 0.0, 0.52276611328125, 0.0, 0.5228271484375, 0.0, 0.52288818359375, 0.0, 0.52294921875, 0.0, 0.52301025390625, 0.0, 0.5230712890625, 0.0, 0.52313232421875, 0.0, 0.523193359375, 0.0, 0.52325439453125, 0.0, 0.5233154296875, 0.0, 0.52337646484375, 0.0, 0.5234375, 0.0, 0.523468017578125, 0.0, 0.52349853515625, 0.0, 0.523529052734375, 0.0, 0.5235595703125, 0.0, 0.523590087890625, 0.0, 0.52362060546875, 0.0, 0.523651123046875, 0.0, 0.523681640625, 0.0, 0.523712158203125, 0.0, 0.52374267578125, 0.0, 0.523773193359375, 0.0, 0.5238037109375, 0.0, 0.523834228515625, 0.0, 0.52386474609375, 0.0, 0.523895263671875, 0.0, 0.52392578125, 0.0, 0.523956298828125, 0.0, 0.52398681640625, 0.0, 0.524017333984375, 0.0, 0.5240478515625, 0.0, 0.524078369140625, 0.0, 0.52410888671875, 0.0, 0.524139404296875, 0.0, 0.524169921875, 0.0, 0.524200439453125, 0.0, 0.52423095703125, 0.0, 0.524261474609375, 0.0, 0.5242919921875, 0.0, 0.524322509765625, 0.0, 0.52435302734375, 0.0, 0.524383544921875, 0.0, 0.5244140625, 0.0, 0.524444580078125, 0.0, 0.52447509765625, 0.0, 0.524505615234375, 0.0, 0.5245361328125, 0.0, 0.524566650390625, 0.0, 0.52459716796875, 0.0, 0.524627685546875, 0.0, 0.524658203125, 0.0, 0.524688720703125, 0.0, 0.52471923828125, 0.0, 0.524749755859375, 0.0, 0.5247802734375, 0.0, 0.524810791015625, 0.0, 0.52484130859375, 0.0, 0.524871826171875, 0.0, 0.52490234375, 0.0, 0.524932861328125, 0.0, 0.52496337890625, 0.0, 0.524993896484375, 0.0, 0.5250244140625, 0.0, 0.525054931640625, 0.0, 0.52508544921875, 0.0, 0.525115966796875, 0.0, 0.525146484375, 0.0, 0.525177001953125, 0.0, 0.52520751953125, 0.0, 0.525238037109375, 0.0, 0.5252685546875, 0.0, 0.525299072265625, 0.0, 0.52532958984375, 0.0, 0.525360107421875, 0.0, 0.525390625, 0.0, 0.525421142578125, 0.0, 0.52545166015625, 0.0, 0.525482177734375, 0.0, 0.5255126953125, 0.0, 0.525543212890625, 0.0, 0.52557373046875, 0.0, 0.525604248046875, 0.0, 0.525634765625, 0.0, 0.525665283203125, 0.0, 0.52569580078125, 0.0, 0.525726318359375, 0.0, 0.5257568359375, 0.0, 0.525787353515625, 0.0, 0.52581787109375, 0.0, 0.525848388671875, 0.0, 0.52587890625, 0.0, 0.525909423828125, 0.0, 0.52593994140625, 0.0, 0.525970458984375, 0.0, 0.5260009765625, 0.0, 0.526031494140625, 0.0, 0.52606201171875, 0.0, 0.526092529296875, 0.0, 0.526123046875, 0.0, 0.526153564453125, 0.0, 
0.52618408203125, 0.0, 0.526214599609375, 0.0, 0.5262451171875, 0.0, 0.526275634765625, 0.0, 0.52630615234375, 0.0, 0.526336669921875, 0.0, 0.5263671875, 0.0, 0.526397705078125, 0.0, 0.52642822265625, 0.0, 0.526458740234375, 0.0, 0.5264892578125, 0.0, 0.526519775390625, 0.0, 0.52655029296875, 0.0, 0.526580810546875, 0.0, 0.526611328125, 0.0, 0.526641845703125, 0.0, 0.52667236328125, 0.0, 0.526702880859375, 0.0, 0.5267333984375, 0.0, 0.526763916015625, 0.0, 0.52679443359375, 0.0, 0.526824951171875, 0.0, 0.52685546875, 0.0, 0.526885986328125, 0.0, 0.52691650390625, 0.0, 0.526947021484375, 0.0, 0.5269775390625, 0.0, 0.527008056640625, 0.0, 0.52703857421875, 0.0, 0.527069091796875, 0.0, 0.527099609375, 0.0, 0.527130126953125, 0.0, 0.52716064453125, 0.0, 0.527191162109375, 0.0, 0.5272216796875, 0.0, 0.527252197265625, 0.0, 0.52728271484375, 0.0, 0.527313232421875]
tests/lambdas/test_lambdaSpeechToScore_librosa.py CHANGED
@@ -1,5 +1,7 @@
 import unittest
 
+import numpy as np
+
 from aip_trainer.lambdas import lambdaSpeechToScore
 from aip_trainer.utils.utilities import hash_calculate
 from tests import EVENTS_FOLDER
@@ -45,7 +47,7 @@ class TestCalcStartEnd(unittest.TestCase):
         self.assertEqual(output, 48000 * 4)
 
 
-class TestAudioReadLoad(unittest.TestCase):
+# class TestAudioReadLoad(unittest.TestCase):
 
     def test_audioread_load_full_file(self):
        signal, sr_native = lambdaSpeechToScore.audioread_load(input_file_test_de)
@@ -92,5 +94,54 @@ class TestAudioReadLoad(unittest.TestCase):
         self.assertEqual(hash_output, b'47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=')
 
 
+class TestBufToFloat(unittest.TestCase):
+    def test_buf_to_float_2_bytes(self):
+        int_buffer = np.array([0, 32767, -32768], dtype=np.int16).tobytes()
+        expected_output = np.array([0.0, 1.0, -1.0], dtype=np.float32)
+        output = lambdaSpeechToScore.buf_to_float(int_buffer, n_bytes=2, dtype=np.float32)
+        np.testing.assert_array_almost_equal(output, expected_output, decimal=3)
+
+    def test_buf_to_float_1_byte(self):
+        int_buffer = np.array([0, 127, -128], dtype=np.int8).tobytes()
+        expected_output = np.array([0.0, 0.9921875, -1.0], dtype=np.float32)
+        output = lambdaSpeechToScore.buf_to_float(int_buffer, n_bytes=1, dtype=np.float32)
+        np.testing.assert_array_almost_equal(output, expected_output, decimal=3)
+
+    def test_buf_to_float_4_bytes(self):
+        int_buffer = np.array([0, 2147483647, -2147483648], dtype=np.int32).tobytes()
+        expected_output = np.array([0.0, 1.0, -1.0], dtype=np.float32)
+        output = lambdaSpeechToScore.buf_to_float(int_buffer, n_bytes=4, dtype=np.float32)
+        np.testing.assert_array_almost_equal(output, expected_output, decimal=3)
+
+    def test_buf_to_float_custom_dtype(self):
+        int_buffer = np.array([0, 32767, -32768], dtype=np.int16).tobytes()
+        expected_output = np.array([0.0, 0.999969482421875, -1.0], dtype=np.float64)
+        output = lambdaSpeechToScore.buf_to_float(int_buffer, n_bytes=2, dtype=np.float64)
+        np.testing.assert_array_almost_equal(output, expected_output, decimal=3)
+
+    def test_buf_to_float_empty_buffer(self):
+        int_buffer = np.array([], dtype=np.int16).tobytes()
+        expected_output = np.array([], dtype=np.float32)
+        output = lambdaSpeechToScore.buf_to_float(int_buffer, n_bytes=2, dtype=np.float32)
+        np.testing.assert_array_almost_equal(output, expected_output, decimal=3)
+
+    def test_buf_to_float_512_bytes(self):
+        import json
+
+        float_arr = np.arange(-256, 256, dtype=np.float32)
+        float_buffer = float_arr.tobytes()
+        output = lambdaSpeechToScore.buf_to_float(float_buffer, dtype=np.float32)  # default n_bytes=2
+        hash_output = hash_calculate(output, is_file=False)
+        # serialized = serialize.serialize(output)
+        # with open(EVENTS_FOLDER / "test_float_buffer.json", "w") as f:
+        #     json.dump(serialized, f)
+        with open(EVENTS_FOLDER / "test_float_buffer.json", "r") as f:
+            expected = f.read()
+        expected_output = np.asarray(json.loads(expected), dtype=np.float32)
+        hash_expected_output = hash_calculate(expected_output, is_file=False)
+        assert hash_output == hash_expected_output
+        np.testing.assert_array_almost_equal(output, expected_output)
+
+
 if __name__ == "__main__":
     unittest.main()
 
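To exercise only the new tests without a full mutation run, a minimal sketch using plain `pytest` (the `-k` filter matches the `TestBufToFloat` class added above):

```bash
python -m pytest tests/lambdas/test_lambdaSpeechToScore_librosa.py -k TestBufToFloat -v
```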