`.github/ISSUE_TEMPLATE.md` (repo: ldez/atom-autocomplete-asciidoc)

## Description
Your description.
## Screenshots
Your screenshot.
`docs/Way of Working/Release-Steps_Entity.md` (repo: ochiu/entity)

- Note: These release notes are currently used by the Entity team only.
- We work with the Relationships team to document/update their release process. Here is an example:
https://app.zenhub.com/workspaces/entity-5bf2f2164b5806bc2bf60531/issues/bcgov/entity/2485
**Prior to moving into production, the following steps should be completed or confirmed:**
- [ ] Update version number in the code being released
- [ ] Ensure deployment steps are filled in - remove/replace placeholders
- [ ] Create a draft release in GitHub and confirm the correct commits are present
- [ ] Add version # and release # to this ticket
- [ ] Dev to send commit list to QA (or otherwise publish changelog)
- commits are in the draft release for <Release Number>
- [ ] QA to schedule the release with staff/clients (daytime is best, or when staff are available for rollback)
- [ ] All dev/test pipeline test suites green
- [ ] Dev/ QA chat to plan prod verification testing (unless already automated)
- [ ] Release the code to production and complete smoke test (STEPS BELOW)
- [ ] Finalise/publish the release in GitHub, tagging it
- [ ] Merge release branch back to master (if applicable)
- [ ] Change openshift builds/pipelines back to master (if applicable)
- [ ] colin-api-base
- [ ] legal-api
- [ ] update-colin-filings
- [ ] update-legal-filings
- [ ] entity-filer
- [ ] entity-pay
----
**Deployment Steps**
**DEV:**
- [ ] config map changes
- [ ] are there any dependencies, such as an auth/pay deployment or keycloak changes?
- [ ] one-time steps to be done:
- [ ] rebuild colin-updater
- [ ] rebuild legal-updater
- [ ] dev verification
- [ ] QA verification
**TEST:**
- [ ] one-time steps to be done:
- [ ] rebuild colin-updater
- [ ] rebuild legal-updater
- [ ] DEV verification
- [ ] QA verification
**PROD:**
- [ ] link colin event ids 102132378 and 102132380 to filing_id = 251 before running colin-updater <This step might be specific to a particular release>
- [ ] one-time steps to be done:
- [ ] build colin-updater
- [ ] rebuild legal-updater
**Notify PO and CM**
- [ ] Send the PO a message in RocketChat if a release was successfully implemented
- [ ] PO will send Trish a message if a release was successfully implemented: Trish.Reimer@gov.bc.ca - and karla.maria.ramirez@gov.bc.ca
`Glorp.package/InfixFunction.class/README.md` (repo: akgrant43/Glorp)

This is a function that is infix. That is, it prints its name in between its arguments. For example, + for string concatenation, || for logical OR.
Instance Variables:
- arguments `<Collection of GlorpExpression>`: Our post-arguments (the first one being the base).
- functionParts `<Array of: String>`: The parts of our name. For a two-argument function, this is just the same as an array containing the name, but for one with more (e.g. BETWEEN AND) it contains the different portions of the name.
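As a rough cross-language illustration (the class itself is Smalltalk; the Python function below and its names are hypothetical), printing an infix function interleaves the parts of the name with the arguments:

```python
def print_infix(function_parts, arguments):
    """Interleave the parts of an infix function's name with its arguments.

    parts=["||"] with args ["a", "b"]                   -> "a || b"
    parts=["BETWEEN", "AND"] with args ["x", "1", "10"] -> "x BETWEEN 1 AND 10"
    """
    pieces = [arguments[0]]  # the base argument comes first
    for part, argument in zip(function_parts, arguments[1:]):
        pieces.append(part)
        pieces.append(argument)
    return " ".join(pieces)

result = print_infix(["BETWEEN", "AND"], ["x", "1", "10"])  # "x BETWEEN 1 AND 10"
```

For a two-argument function the parts list holds just the name, which is why it degenerates to the familiar `a || b` form.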
`SIP_2021 - Task4 - Numpy, OpenCV and PIL/README.md` (repo: Harshal-Atmaramani/SummerInternshipProgram2021)

### This task is one of many tasks that were given to summer interns at Linuxworld Informatics Pvt Ltd by our beloved Mr. Vimal Daga Sir.
#### The Task Description 📄 is as follows:
Summer - Task 04 👨🏻💻
⚜️ Team Task
🔅 Task 4.1
📌 Create image by yourself using Python Code.
🔅 Task 4.2
📌 Take 2 images, crop some part of both images, and swap them.
🔅 Task 4.3
📌 Take 2 images (we have used 4) and combine them to form a single image.
For example, a collage.
### We have used Google Colaboratory to do the task, since it is collaborative and uses cloud resources to process the code. If you intend to use the code in this repository, kindly make sure either to use Google Colaboratory or to make the necessary changes to use it in Jupyter Notebook.
### Some points of difference between Google Colaboratory and Jupyter Notebook are listed below:
1. from google.colab.patches import cv2_imshow is equal to import cv2 in Jupyter NB.
2. cv2_imread in Colab is equal to cv2.imread in Jupyter NB.
3. cv2_imshow in Colab is equal to three statements in Jupyter NB.
cv2.imshow('Img_Name', arr_var)
cv2.waitKey()
cv2.destroyAllWindows()
`docs/Reference/Collection/CollectionManipulation.md` (repo: jamesm-macrometa/jsC8)

## Manipulating the collection
These functions implement [the HTTP API for modifying collections](https://developer.document360.io/docs/using-c8-rest-api).
## client.createCollection
`async client.createCollection(collectionName, [properties], [isEdge]): Object`
Creates collection
**Arguments**
- **collectionName**: `string`
Name of the collection
- **properties**: `Object` (optional)
For more information on the 'properties` object, see [the HTTP API documentation for creating collections](https://developer.document360.io/docs/using-c8-rest-api).
- **isEdge**: `boolean` (optional)
If true, an edge collection is created. Default is false.
Note: if this property is provided, there is no need to pass `type` in the `properties` object.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
await client.createCollection('some-collection');
```
## client.deleteCollection
`async client.deleteCollection(collectionName, [opts]): Object`
Deletes collection
**Arguments**
- **collectionName**: `string`
Name of the collection
- **opts**: `Object` (optional)
An object with the following properties:
- **isSystem**: `boolean` (Default: `false`)
Whether the collection should be dropped even if it is a system collection.
This parameter must be set to `true` when dropping a system collection.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
await client.deleteCollection('some-collection');
```
## client.hasCollection
`async client.hasCollection(collectionName): Boolean`
Returns true if collection exists otherwise false
**Arguments**
- **collectionName**: `string`
Name of the collection
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const hasCollection = await client.hasCollection('some-collection');
```
## client.getCollection
`async client.getCollection(collectionName): Object`
Returns collection info
**Arguments**
- **collectionName**: `string`
Name of the collection
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = await client.getCollection('some-collection');
```
## client.getCollections
`async client.getCollections(collectionName): Array<Object>`
Returns collections info
**Arguments**
- **collectionName**: `string`
Name of the collection
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collections = await client.getCollections('some-collection');
```
## client.getCollectionIds
`async client.getCollectionIds(collectionName): Array<Object>`
Returns collection Ids
**Arguments**
- **collectionName**: `string`
Name of the collection
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collectionIds = await client.getCollectionIds('some-collection');
```
## client.getCollectionKeys
`async client.getCollectionKeys(collectionName): Array<Object>`
Returns collection keys
**Arguments**
- **collectionName**: `string`
Name of the collection
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collectionKeys = await client.getCollectionKeys('some-collection');
```
## client.onCollectionChange
`async client.onCollectionChange(collectionName, [subscriptionName], [dcName]): void`
**Arguments**
- **collectionName**: `string`
Name of the collection
- **dcName**: `string` (optional)
The dcName for the consumer.
- **subscriptionName**: `string` (optional)
The name of the subscription.
**Methods**
`listener.on('open', callback )`
`listener.on('message', callback )`
`listener.on('close', callback )`
`listener.on('error', callback )`
`listener.close()`
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const listener = await client.onCollectionChange("some-collection");
listener.on('message',(msg) => console.log("message=>", msg));
listener.on('open',() => console.log("connection open"));
listener.on('close',() => console.log("connection closed"));
```
## client.enableCollectionStream
`async client.enableCollectionStream(collectionName, enableStream): Object`
Updates the collection stream flag.
**Arguments**
- **collectionName**: `string`
Name of the collection
- **enableStream**: `boolean`
Whether the stream should be enabled on the collection or not.
This parameter must be set to `true` when enabling a stream on the collection.
**NOTE**: You can't set the flag to `false` as on-demand stream deletion is not allowed.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = await client.enableCollectionStream('some-collection', enableStream);
```
# Advanced User
## collection.create
`async collection.create([properties]): Object`
Creates a collection with the given `properties` for this collection's name, then returns the server response.
**Arguments**
- **properties**: `Object` (optional)
For more information on the 'properties` object, see [the HTTP API documentation for creating collections](https://developer.document360.io/docs/using-c8-rest-api).
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = client.collection('potatoes');
await collection.create()
// the document collection "potatoes" now exists
// -- or --
const collection = client.edgeCollection('friends');
await collection.create({
waitForSync: true // always sync document changes to disk
});
// the edge collection "friends" now exists
```
Note: To make the collection a `spot` collection, pass `isSpot: true` in the `properties` object.
## collection.onChange
`collection.onChange(dcName, subscriptionName): void`
**Arguments**
- **dcName**: `string``
The dcName for the consumer.
- **subscriptionName**: `string`
The name of the subscription.
**Methods**
`listener.on('open', callback )`
`listener.on('message', callback )`
`listener.on('close', callback )`
`listener.on('error', callback )`
`listener.close()`
**Examples**
```js
const listener = await collection.onChange("fed.macrometa.io", "mySub");
listener.on('message',(msg) => console.log("message=>", msg));
listener.on('open',() => console.log("connection open"));
listener.on('close',() => console.log("connection closed"));
```
## collection.rename
`async collection.rename(name): Object`
Renames the collection. The `Collection` instance will automatically update its name when the rename succeeds.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = client.collection('some-collection');
const result = await collection.rename('new-collection-name')
assert.equal(result.name, 'new-collection-name');
assert.equal(collection.name, result.name);
// result contains additional information about the collection
```
## collection.truncate
`async collection.truncate(): Object`
Deletes **all documents** in the collection.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = client.collection('some-collection');
await collection.truncate();
// the collection "some-collection" is now empty
```
## collection.drop
`async collection.drop([properties]): Object`
Deletes the collection.
**Arguments**
- **properties**: `Object` (optional)
An object with the following properties:
- **isSystem**: `boolean` (Default: `false`)
Whether the collection should be dropped even if it is a system collection.
This parameter must be set to `true` when dropping a system collection.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = client.collection('some-collection');
await collection.drop();
// the collection "some-collection" no longer exists
```
## collection.enableCollectionStream
`async collection.enableCollectionStream(enableStream): Object`
Updates the collection stream flag.
**Arguments**
- **enableStream**: `boolean`
Whether the stream should be enabled on the collection or not.
This parameter must be set to `true` when enabling a stream on the collection.
**NOTE**: You can't set the flag to `false` as on-demand stream deletion is not allowed.
**Examples**
```js
const client = new jsc8({url: "https://gdn1.macrometa.io", token: "XXXX"});
//---- OR ----
const client = new jsc8({url: "https://gdn1.macrometa.io", apiKey: "XXXX"});
const collection = client.collection('some-collection');
await collection.enableCollectionStream(enableStream);
```
`README.md` (repo: james-alvey-42/ReinforcementLearning)

# Reinforcement Learning
A repository to learn and practice ideas in reinforcement learning, starting with classical problems such as the multi-armed bandit and moving towards more complex control tasks and risk evaluation. The main reference is [this book](https://web.stanford.edu/class/psych209/Readings/SuttonBartoIPRLBook2ndEd.pdf) by Sutton and Barto.
## Contents
* *Multi-Armed Bandit:* An implementation of the multi-armed bandit problem to illustrate the trade-off between the greedy and exploratory strategies. A summary of the theory and the code can be found [here](https://james-alvey-42.github.io/multiarmedbandit).
<img src="img/multi-armed-bandit.png" height="300"/>
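As a flavor of the greedy-vs-exploratory trade-off the bandit section studies, here is a minimal epsilon-greedy sketch (illustrative only — not code from this repository; the arm probabilities and hyperparameters are invented):

```python
import random

def epsilon_greedy(true_means, epsilon=0.1, steps=5000, seed=0):
    """Epsilon-greedy agent on a Bernoulli multi-armed bandit.

    With probability `epsilon` we explore a uniformly random arm; otherwise
    we exploit the arm with the highest estimated value so far.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms        # how many times each arm was pulled
    estimates = [0.0] * n_arms   # sample-average value estimates
    total_reward = 0.0

    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                           # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental sample-average update: Q <- Q + (r - Q) / n
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total_reward += reward

    return estimates, counts, total_reward

# With enough steps the agent concentrates its pulls on the best arm (p = 0.8).
estimates, counts, total = epsilon_greedy([0.2, 0.5, 0.8])
```

Setting `epsilon=0` recovers the purely greedy strategy, which can lock onto a suboptimal arm; a small positive `epsilon` trades a little short-term reward for better value estimates.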
`articles/active-directory/hybrid/tshoot-connect-sync-errors.md` (repo: md2perpe/azure-docs.sv-se)

---
title: 'Azure AD Connect: Troubleshoot errors during synchronization | Microsoft Docs'
description: Describes how to troubleshoot errors encountered during synchronization with Azure AD Connect.
services: active-directory
documentationcenter: ''
author: billmath
manager: daveba
ms.assetid: 2209d5ce-0a64-447b-be3a-6f06d47995f8
ms.service: active-directory
ms.workload: identity
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 10/29/2018
ms.subservice: hybrid
ms.author: billmath
ms.collection: M365-identity-device-management
ms.openlocfilehash: f63aebb9a9bbefe84ac36b92cd69e0d93de0ab76
ms.sourcegitcommit: d4dfbc34a1f03488e1b7bc5e711a11b72c717ada
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 06/13/2019
ms.locfileid: "66298761"
---
# <a name="troubleshooting-errors-during-synchronization"></a>Troubleshooting errors during synchronization
Errors can occur when identity data is synchronized from Windows Server Active Directory (AD DS) to Azure Active Directory (Azure AD). This article provides an overview of the different types of synchronization errors, some of the possible scenarios that cause those errors, and potential ways to fix the errors. This article covers the common error types and may not cover all possible errors.
This article assumes the reader is familiar with the underlying [design concepts of Azure AD and Azure AD Connect](plan-connect-design-concepts.md).
With the latest version of Azure AD Connect \(August 2016 or higher\), a report of synchronization errors is available in the [Azure portal](https://aka.ms/aadconnecthealth) as part of Azure AD Connect Health for sync.
Starting September 1, 2016, the [Azure Active Directory Duplicate Attribute Resiliency](how-to-connect-syncservice-duplicate-attribute-resiliency.md) feature is enabled by default for all *new* Azure Active Directory tenants. The feature will be enabled automatically for existing tenants in the upcoming months.
Azure AD Connect performs three types of operations against the directories it keeps synchronized: Import, Synchronization, and Export. Errors can occur in all of these operations. This article mainly focuses on errors during Export to Azure AD.
## <a name="errors-during-export-to-azure-ad"></a>Errors during Export to Azure AD
The following section describes the different types of synchronization errors that can occur during the export operation to Azure AD using the Azure AD connector. This connector can be identified by its name format, "contoso.*onmicrosoft.com*".
Errors during Export to Azure AD indicate that an operation \(add, update, delete, and so on\) attempted by Azure AD Connect \(the sync engine\) against Azure Active Directory failed.
![Export Errors Overview](./media/tshoot-connect-sync-errors/Export_Errors_Overview_01.png)
## <a name="data-mismatch-errors"></a>Data mismatch errors
### <a name="invalidsoftmatch"></a>InvalidSoftMatch
#### <a name="description"></a>Description
* When Azure AD Connect \(the sync engine\) instructs Azure Active Directory to add or update objects, Azure AD matches the incoming object using its **sourceAnchor** attribute against the **immutableId** attribute of objects in Azure AD. This match is called a **hard match**.
* When Azure AD **does not find** any object whose **immutableId** attribute matches the **sourceAnchor** attribute of the incoming object, before provisioning a new object it falls back to using the ProxyAddresses and UserPrincipalName attributes to find a match. This match is called a **soft match**. The soft match is designed to match objects already present in Azure AD (that are sourced in Azure AD) with the new objects being added or updated during synchronization that represent the same entity (user, group) on premises.
* The **InvalidSoftMatch** error occurs when the hard match does not find any matching object **AND** the soft match finds a matching object, but that object has a different value of *immutableId* than the incoming object's *sourceAnchor*, suggesting that the matching object was synchronized with another object from on-premises Active Directory.
In other words, for the soft match to work, the object to be soft-matched with should not have any value for the *immutableId*. If an object with *immutableId* set to a value fails the hard match but satisfies the soft-match criteria, the operation results in an InvalidSoftMatch synchronization error.
The Azure Active Directory schema does not allow two or more objects to have the same value for the following attributes. \(This is not an exhaustive list.\)
* ProxyAddresses
* UserPrincipalName
* onPremisesSecurityIdentifier
* ObjectId
> [!NOTE]
> The [Azure AD Duplicate Attribute Resiliency](how-to-connect-syncservice-duplicate-attribute-resiliency.md) feature is also being rolled out as the default behavior of Azure Active Directory. It reduces the number of synchronization errors seen by Azure AD Connect (as well as other sync clients) by making Azure AD more resilient in the way it handles duplicated ProxyAddresses and UserPrincipalName attributes present in on-premises AD environments. This feature does not fix the duplication errors, so the data still needs to be fixed. However, it allows provisioning of new objects that are otherwise blocked from being provisioned because of duplicated values in Azure AD. This also reduces the number of synchronization errors returned to the synchronization client.
> If this feature is enabled for your tenant, you will not see the InvalidSoftMatch synchronization errors seen during the provisioning of new objects.
>
>
#### <a name="example-scenarios-for-invalidsoftmatch"></a>Example scenarios for InvalidSoftMatch
1. Two or more objects with the same value for the ProxyAddresses attribute exist in on-premises Active Directory. Only one of them gets provisioned in Azure AD.
2. Two or more objects with the same value for the userPrincipalName attribute exist in on-premises Active Directory. Only one of them gets provisioned in Azure AD.
3. An object was added in on-premises Active Directory with the same value for the ProxyAddresses attribute as that of an existing object in Azure Active Directory. The object added on premises does not get provisioned in Azure Active Directory.
4. An object was added in on-premises Active Directory with the same value for the userPrincipalName attribute as that of an account in Azure Active Directory. The object does not get provisioned in Azure Active Directory.
5. A synced account was moved from Forest A to Forest B. Azure AD Connect (the sync engine) was using the ObjectGUID attribute to compute the SourceAnchor. After the forest move, the value of the SourceAnchor is different. The new object (from Forest B) fails to sync with the existing object in Azure AD.
6. A synced object was accidentally deleted from on-premises Active Directory, and a new object was created in Active Directory for the same entity (such as a user) without deleting the account in Azure Active Directory. The new account fails to sync with the existing Azure AD object.
7. Azure AD Connect was uninstalled and reinstalled. During the reinstallation, a different attribute was chosen as the SourceAnchor. All objects that had previously synced stopped syncing with the InvalidSoftMatch error.
#### <a name="example-case"></a>Example case:
1. **Bob Smith** is a synced user in Azure Active Directory from the on-premises Active Directory of *contoso.com*
2. Bob Smith's **UserPrincipalName** is set as **bobs\@contoso.com**.
3. **"abcdefghijklmnopqrstuv=="** is the **SourceAnchor** calculated by Azure AD Connect using Bob Smith's **objectGUID** from on-premises Active Directory, which is the **immutableId** for Bob Smith in Azure Active Directory.
4. Bob also has the following values for the **proxyAddresses** attribute:
   * smtp: bobs@contoso.com
   * smtp: bob.smith@contoso.com
   * **smtp: bob\@contoso.com**
5. A new user, **Bob Taylor**, is added to on-premises Active Directory.
6. Bob Taylor's **UserPrincipalName** is set as **bobt\@contoso.com**.
7. **"abcdefghijkl0123456789=="** is the **sourceAnchor** calculated by Azure AD Connect using Bob Taylor's **objectGUID** from on-premises Active Directory. Bob Taylor's object has not synced to Azure Active Directory yet.
8. Bob Taylor has the following values for the proxyAddresses attribute:
   * smtp: bobt@contoso.com
   * smtp: bob.taylor@contoso.com
   * **smtp: bob\@contoso.com**
9. During sync, Azure AD Connect recognizes the addition of Bob Taylor in on-premises Active Directory and asks Azure AD to make the same change.
10. Azure AD first performs the hard match. That is, it searches for an object with immutableId equal to "abcdefghijkl0123456789==". The hard match fails, because no other object in Azure AD has that immutableId.
11. Azure AD then attempts to soft-match Bob Taylor. That is, it searches for an object whose proxyAddresses include the values above, including smtp: bob@contoso.com
12. Azure AD finds that Bob Smith's object matches the soft-match criteria. However, that object has immutableId = "abcdefghijklmnopqrstuv==", which means it was synced from another object in on-premises Active Directory. Therefore, Azure AD cannot soft-match these objects, and the result is an **InvalidSoftMatch** error.
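The hard-match / soft-match flow in the example above can be sketched in Python (a deliberately simplified illustration with hypothetical dict fields — not Azure AD's actual implementation):

```python
def match_incoming(incoming, azure_objects):
    """Illustrate the hard-match-then-soft-match order for one incoming object.

    `incoming` and the entries of `azure_objects` are plain dicts with
    'sourceAnchor'/'immutableId', 'userPrincipalName', and 'proxyAddresses' keys.
    """
    # 1. Hard match: incoming sourceAnchor against existing immutableId.
    for obj in azure_objects:
        if obj["immutableId"] == incoming["sourceAnchor"]:
            return ("hard-match", obj)

    # 2. Soft match: fall back to proxyAddresses / userPrincipalName.
    for obj in azure_objects:
        shares_proxy = set(obj["proxyAddresses"]) & set(incoming["proxyAddresses"])
        same_upn = obj["userPrincipalName"] == incoming["userPrincipalName"]
        if shares_proxy or same_upn:
            if obj["immutableId"] is not None:
                # The candidate is already anchored to a different
                # on-premises object -> cannot soft-match.
                return ("InvalidSoftMatch", obj)
            return ("soft-match", obj)

    return ("provision-new", None)

bob_smith = {
    "immutableId": "abcdefghijklmnopqrstuv==",
    "userPrincipalName": "bobs@contoso.com",
    "proxyAddresses": ["smtp:bobs@contoso.com", "SMTP:bob@contoso.com"],
}
bob_taylor = {
    "sourceAnchor": "abcdefghijkl0123456789==",
    "userPrincipalName": "bobt@contoso.com",
    "proxyAddresses": ["smtp:bobt@contoso.com", "SMTP:bob@contoso.com"],
}
outcome, matched = match_incoming(bob_taylor, [bob_smith])  # -> "InvalidSoftMatch"
```

Because Bob Smith's object already carries an immutableId from another on-premises object, the shared `SMTP:bob@contoso.com` address triggers the InvalidSoftMatch outcome rather than a successful soft match.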
#### <a name="how-to-fix-invalidsoftmatch-error"></a>How to fix the InvalidSoftMatch error
The most common reason for the InvalidSoftMatch error is that two objects with different SourceAnchor \(immutableId\) values have the same value for the ProxyAddresses and/or UserPrincipalName attributes, which are used during the soft-match process in Azure AD. To fix the invalid soft match:
1. Identify the duplicated proxyAddresses, userPrincipalName, or other attribute value that is causing the error. Also identify which two \(or more\) objects are involved in the conflict. The report generated by [Azure AD Connect Health for sync](https://aka.ms/aadchsyncerrors) can help you identify the two objects.
2. Identify which object should continue to have the duplicated value and which object should not.
3. Remove the duplicated value from the object that should not have it. You should make the change in the directory from which the object is sourced. In some cases, you may need to delete one of the objects in conflict.
4. If you made the change in on-premises AD, let Azure AD Connect sync the change.
Sync error reports within Azure AD Connect Health for sync are updated every 30 minutes and include the errors from the latest synchronization attempt.
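The first remediation step — locating the objects that share a duplicated attribute value — can be sketched like this (a hypothetical helper over objects exported as Python dicts, not an Azure AD Connect API):

```python
from collections import defaultdict

def find_duplicates(objects, attribute="proxyAddresses"):
    """Group object ids by each attribute value that appears on more than one object."""
    owners = defaultdict(list)
    for obj in objects:
        values = obj.get(attribute) or []
        if isinstance(values, str):  # single-valued attributes like userPrincipalName
            values = [values]
        for value in values:
            owners[value.lower()].append(obj["objectId"])
    # Keep only the values claimed by two or more objects.
    return {value: ids for value, ids in owners.items() if len(ids) > 1}

directory = [
    {"objectId": "user-1", "proxyAddresses": ["SMTP:bob@contoso.com", "smtp:bobs@contoso.com"]},
    {"objectId": "user-2", "proxyAddresses": ["SMTP:bob@contoso.com"]},
    {"objectId": "user-3", "proxyAddresses": ["smtp:alice@contoso.com"]},
]
conflicts = find_duplicates(directory)
```

Once a conflicting pair is identified this way, steps 2–4 above (deciding the rightful owner and removing the duplicate in the source directory) still have to be done by hand.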
> [!NOTE]
> The immutableId, by definition, should not change during the lifetime of the object. However, if Azure AD Connect was not configured with some of the scenarios from the list above in mind, you could end up in a situation where Azure AD Connect calculates a different SourceAnchor value for the AD object that represents the same entity (the same user, group, contact, and so on) that has an existing Azure AD object you wish to continue using.
>
>
#### <a name="related-articles"></a>Related articles
* [Duplicate or invalid attributes prevent directory synchronization in Office 365](https://support.microsoft.com/kb/2647098)
### <a name="objecttypemismatch"></a>ObjectTypeMismatch
#### <a name="description"></a>Description
When Azure AD attempts to soft-match two objects, it is possible that two objects of different "object type" (such as user, group, contact, and so on) have the same values for the attributes used to perform the soft match. Because duplication of these attributes is not permitted in Azure AD, the operation can result in an "ObjectTypeMismatch" synchronization error.
#### <a name="example-scenarios-for-objecttypemismatch-error"></a>Example scenarios for the ObjectTypeMismatch error
* A mail-enabled security group is created in Office 365. The admin adds a new user or contact in on-premises AD (not yet synchronized to Azure AD) with the same value for the ProxyAddresses attribute as the Office 365 group.
#### <a name="example-case"></a>Example case
1. The admin creates a new mail-enabled security group in Office 365 for the Tax department and provides an email address of tax@contoso.com. This group is assigned the ProxyAddresses attribute value **smtp: tax\@contoso.com**
2. A new user joins Contoso.com, and an account is created for the user on premises with the proxyAddress **smtp: tax\@contoso.com**
3. When Azure AD Connect syncs the new user account, it gets the "ObjectTypeMismatch" error.
#### <a name="how-to-fix-objecttypemismatch-error"></a>How to fix the ObjectTypeMismatch error
The most common reason for the ObjectTypeMismatch error is that two objects of different types (user, group, contact, and so on) have the same value for the ProxyAddresses attribute. To fix the ObjectTypeMismatch:
1. Identify the duplicated proxyAddresses (or other attribute) value that is causing the error. Also identify which two \(or more\) objects are involved in the conflict. The report generated by [Azure AD Connect Health for sync](https://aka.ms/aadchsyncerrors) can help you identify the two objects.
2. Identify which object should continue to have the duplicated value and which object should not.
3. Remove the duplicated value from the object that should not have it. Note that you should make the change in the directory from which the object is sourced. In some cases, you may need to delete one of the objects in conflict.
4. If you made the change in on-premises AD, let Azure AD Connect sync the change. The sync error report within Azure AD Connect Health for sync is updated every 30 minutes and includes the errors from the latest synchronization attempt.
## <a name="duplicate-attributes"></a>Duplicate attributes
### <a name="attributevaluemustbeunique"></a>AttributeValueMustBeUnique
#### <a name="description"></a>Description
The Azure Active Directory schema does not allow two or more objects to have the same value for the following attributes. That is, each object in Azure AD must have a unique value for these attributes in a given instance.
* ProxyAddresses
* UserPrincipalName
If Azure AD Connect attempts to add a new object, or update an existing object, with a value for the above attributes that is already assigned to another object in Azure Active Directory, the operation results in an "AttributeValueMustBeUnique" sync error.
#### <a name="possible-scenarios"></a>Possible scenarios:
1. A duplicate value is assigned to an already synchronized object, conflicting with another synchronized object.
#### <a name="example-case"></a>Example case:
1. **Bob Smith** is a user synchronized to Azure Active Directory from the on-premises Active Directory contoso.com.
2. Bob Smith's **UserPrincipalName** is set on premises to **bobs\@contoso.com**.
3. Bob also has the following values for the **proxyAddresses** attribute:
* smtp: bobs@contoso.com
* smtp: bob.smith@contoso.com
* **SMTP: bob\@contoso.com**
4. A new user, **Bob Taylor**, is added to the on-premises Active Directory.
5. Bob Taylor's **UserPrincipalName** is set to **bobt\@contoso.com**.
6. **Bob Taylor** has the following values for the **ProxyAddresses** attribute: i. smtp: bobt@contoso.com ii. smtp: bob.taylor@contoso.com
7. Bob Taylor's object is synchronized to Azure AD successfully.
8. The admin decides to update Bob Taylor's **ProxyAddresses** attribute with the following value: i. **SMTP: bob\@contoso.com**
9. Azure AD tries to update Bob Taylor's object in Azure AD with the above value, but the operation fails because that ProxyAddresses value is already assigned to Bob Smith, resulting in the "AttributeValueMustBeUnique" error.
#### <a name="how-to-fix-attributevaluemustbeunique-error"></a>How to fix the AttributeValueMustBeUnique error
The most common cause of the AttributeValueMustBeUnique error is that two objects with different SourceAnchor \(immutableId\) values have the same value for the ProxyAddresses and/or UserPrincipalName attributes. To fix the AttributeValueMustBeUnique error:
1. Identify the duplicated proxyAddresses, userPrincipalName, or other attribute value that is causing the error. Also identify which two \(or more\) objects are involved in the conflict. The report generated by [Azure AD Connect Health for sync](https://aka.ms/aadchsyncerrors) can help you identify the two objects.
2. Identify which object should keep the duplicated value and which object should not.
3. Remove the duplicated value from the object that should not have it. Note that you should make the change in the directory the object originates from. In some cases, you may need to delete one of the conflicting objects.
4. If you made the change in on-premises AD, Azure AD Connect will synchronize the change and the error will be resolved.
#### <a name="related-articles"></a>Related articles
- [Duplicate or invalid attributes prevent directory synchronization in Office 365](https://support.microsoft.com/kb/2647098)
## <a name="data-validation-failures"></a>Data validation failures
### <a name="identitydatavalidationfailed"></a>IdentityDataValidationFailed
#### <a name="description"></a>Description
Azure Active Directory enforces various restrictions on the data itself before that data is written to the directory. These restrictions ensure that end users get the best possible experience when using the applications that depend on this data.
#### <a name="scenarios"></a>Scenarios
a. The UserPrincipalName attribute value contains invalid/unsupported characters.
b. The UserPrincipalName attribute does not follow the required format.
#### <a name="how-to-fix-identitydatavalidationfailed-error"></a>How to fix the IdentityDataValidationFailed error
a. Make sure the userPrincipalName attribute uses supported characters and the required format.
#### <a name="related-articles"></a>Related articles
* [Prepare to provision users through directory synchronization to Office 365](https://support.office.com/article/Prepare-to-provision-users-through-directory-synchronization-to-Office-365-01920974-9e6f-4331-a370-13aea4e82b3e)
### <a name="federateddomainchangeerror"></a>FederatedDomainChangeError
#### <a name="description"></a>Description
This case results in a **"FederatedDomainChangeError"** sync error when the suffix of a user's UserPrincipalName is changed from one federated domain to another federated domain.
#### <a name="scenarios"></a>Scenarios
For a synchronized user, the UserPrincipalName suffix was changed from one federated domain to another federated domain on premises. For example, *UserPrincipalName = bob\@contoso.com* was changed to *UserPrincipalName = bob\@fabrikam.com*.
#### <a name="example"></a>Example
1. Bob Smith, an account for Contoso.com, gets added as a new user in Active Directory with the UserPrincipalName bob@contoso.com.
2. Bob moves to a different division of Contoso.com called Fabrikam.com, and his UserPrincipalName is changed to bob@fabrikam.com.
3. Both the contoso.com and fabrikam.com domains are federated domains with Azure Active Directory.
4. Bob's userPrincipalName is not updated, resulting in a "FederatedDomainChangeError" sync error.
#### <a name="how-to-fix"></a>How to fix
If a user's UserPrincipalName suffix was updated from bob@**contoso.com** to bob\@**fabrikam.com**, where both **contoso.com** and **fabrikam.com** are **federated domains**, follow these steps to fix the sync error:
1. Update the user's UserPrincipalName in Azure AD from bob@contoso.com to bob@contoso.onmicrosoft.com. You can use the following PowerShell command with the Azure AD PowerShell module: `Set-MsolUserPrincipalName -UserPrincipalName bob@contoso.com -NewUserPrincipalName bob@contoso.onmicrosoft.com`
2. Allow the next sync cycle to attempt synchronization. This time the synchronization will succeed, and it will update Bob's UserPrincipalName to bob@fabrikam.com as expected.
#### <a name="related-articles"></a>Related articles
* [Changes aren't synced by the Azure Active Directory Sync tool after you change the UPN of a user account to use a different federated domain](https://support.microsoft.com/help/2669550/changes-aren-t-synced-by-the-azure-active-directory-sync-tool-after-you-change-the-upn-of-a-user-account-to-use-a-different-federated-domain)
## <a name="largeobject"></a>LargeObject
### <a name="description"></a>Description
When an attribute exceeds the allowed size limit, length limit, or count limit set by the Azure Active Directory schema, the synchronization operation results in the **LargeObject** or **ExceededAllowedLength** sync error. This error typically occurs for the following attributes:
* userCertificate
* userSMIMECertificate
* thumbnailPhoto
* proxyAddresses
### <a name="possible-scenarios"></a>Possible scenarios
1. Bob's userCertificate attribute stores too many certificates assigned to Bob. These may include older, expired certificates. The hard limit is 15 certificates. For details on how to handle LargeObject errors caused by the userCertificate attribute, see the article [Handling LargeObject errors caused by the userCertificate attribute](tshoot-connect-largeobjecterror-usercertificate.md).
2. Bob's userSMIMECertificate attribute stores too many certificates assigned to Bob. These may include older, expired certificates. The hard limit is 15 certificates.
3. Bob's thumbnailPhoto in Active Directory is too large to be synchronized to Azure AD.
4. An object has too many ProxyAddresses assigned during the automatic population of the ProxyAddresses attribute in Active Directory.
### <a name="how-to-fix"></a>How to fix
1. Ensure that the attribute causing the error is within the allowed limit.
## <a name="existing-admin-role-conflict"></a>Existing admin role conflict
### <a name="description"></a>Description
An **existing admin role conflict** occurs on a user object during synchronization when that user object has:
- administrative permissions, and
- the same UserPrincipalName as an existing Azure AD object
Azure AD Connect is not allowed to soft match a user object from on-premises AD with a user object in Azure AD that has an administrative role assigned to it. For more information, see [Azure AD UserPrincipalName population](plan-connect-userprincipalname.md).
![Existing admin](media/tshoot-connect-sync-errors/existingadmin.png)
### <a name="how-to-fix"></a>How to fix
To resolve this issue, do one of the following:
- Change the UserPrincipalName to a value that does not match that of an admin in Azure AD, which creates a new user in Azure AD with the matching UserPrincipalName.
- Remove the administrative role from the admin user in Azure AD, which allows the soft match between the on-premises user object and the existing Azure AD user object.
>[!NOTE]
>You can assign the administrative role to the existing user object again after the soft match between the on-premises user object and the Azure AD user object has completed.
## <a name="related-links"></a>Related links
* [Locate Active Directory objects in Active Directory Administrative Center](https://technet.microsoft.com/library/dd560661.aspx)
* [How to query Azure Active Directory for an object by using Azure Active Directory PowerShell](https://msdn.microsoft.com/library/azure/jj151815.aspx)
| 93.868 | 754 | 0.804321 | swe_Latn | 0.999142 |
3b7c53bcfec0661330cccedf6f0b0e0828e940e6 | 3,230 | md | Markdown | content/publication/knowledgebert/index.md | jivatneet/starter-academic | 2667a77fa442d1a09c2b508d80cbaf7aaa6eb387 | [
"MIT"
] | null | null | null | content/publication/knowledgebert/index.md | jivatneet/starter-academic | 2667a77fa442d1a09c2b508d80cbaf7aaa6eb387 | [
"MIT"
] | null | null | null | content/publication/knowledgebert/index.md | jivatneet/starter-academic | 2667a77fa442d1a09c2b508d80cbaf7aaa6eb387 | [
"MIT"
] | null | null | null | ---
# Documentation: https://wowchemy.com/docs/managing-content/
title: "No Need to Know Everything! Efficiently Augmenting Language Models With External Knowledge"
authors: [admin, "Sumit Bhatia", "Milan Aggarwal", "Rachit Bansal", "Balaji Krishnamurthy"]
date: 2021-09-11T22:43:22+05:30
doi: ""
# Schedule page publish date (NOT publication's date).
publishDate: 2021-03-26T22:43:22+05:30
# Publication type.
# Legend: 0 = Uncategorized; 1 = Conference paper; 2 = Journal article;
# 3 = Preprint / Working Paper; 4 = Report; 5 = Book; 6 = Book section;
# 7 = Thesis; 8 = Patent
publication_types: ["3"]
# Publication name and optional abbreviated publication name.
publication: "*Workshop on Commonsense Reasoning and Knowledge Bases (CSKB) at AKBC 2021*"
publication_short: "*Workshop on Commonsense Reasoning and Knowledge Bases (CSKB) at AKBC*"
abstract: "Large transformer-based pre-trained language models have achieved impressive performance on a variety of knowledge-intensive tasks and can capture semantic, syntactic, and factual knowledge in their parameters. However, storing large amounts of factual knowledge in the parameters of the model is sub-optimal given the resource requirements and ever-growing amounts of knowledge. Instead of packing all the knowledge in the model parameters, we argue that a more efficient alternative is to provide contextually relevant structured knowledge to the model and train it to use that knowledge. This allows the training of the language model to be de-coupled from the external knowledge source and the latter can be updated without affecting the parameters of the language model. Empirical evaluation using different subsets of LAMA probe reveals that such an approach allows smaller language models with access to external knowledge to achieve significant and robust outperformance over much larger language models."
# Summary. An optional shortened abstract.
summary: ""
tags: []
categories: []
featured: false
# Custom links (optional).
# Uncomment and edit lines below to show custom links.
# links:
# - name: Follow
# url: https://twitter.com
# icon_pack: fab
# icon: twitter
url_pdf: 'https://openreview.net/pdf?id=fn5K7VfI3MV'
url_code:
url_dataset:
url_poster:
url_project:
url_slides:
url_source:
url_video:
links:
- url: 'https://akbc-cskb.github.io/videos/17.mp4'
# icon_pack: fab
# icon: twitter
name: Talk
# Featured image
# To use, add an image named `featured.jpg/png` to your page's folder.
# Focal points: Smart, Center, TopLeft, Top, TopRight, Left, Right, BottomLeft, Bottom, BottomRight.
image:
caption: ""
focal_point: ""
preview_only: false
# Associated Projects (optional).
# Associate this publication with one or more of your projects.
# Simply enter your project's folder or file name without extension.
# E.g. `internal-project` references `content/project/internal-project/index.md`.
# Otherwise, set `projects: []`.
projects: []
# Slides (optional).
# Associate this publication with Markdown slides.
# Simply enter your slide deck's filename without extension.
# E.g. `slides: "example"` references `content/slides/example/index.md`.
# Otherwise, set `slides: ""`.
slides: ""
---
| 41.948052 | 1,024 | 0.757585 | eng_Latn | 0.959884 |
3b7c7def664eae86548e13766eaa67e5963a8e55 | 26 | md | Markdown | README.md | Aquaticholic/Luddite-Project-Template | 128d90f6f83f3a2f438b46be659213cc8d682549 | [
"Apache-2.0"
] | null | null | null | README.md | Aquaticholic/Luddite-Project-Template | 128d90f6f83f3a2f438b46be659213cc8d682549 | [
"Apache-2.0"
] | null | null | null | README.md | Aquaticholic/Luddite-Project-Template | 128d90f6f83f3a2f438b46be659213cc8d682549 | [
"Apache-2.0"
] | null | null | null | # Luddite-Project-Template | 26 | 26 | 0.846154 | eng_Latn | 0.329054 |
3b811a0ddf4a7f630f0a6c3fabeee2b7d38ae13f | 362 | md | Markdown | Design Challenges.md | Isaac-Irvine/Text-Cluedo | 46b95993d555533e306a1703c67d426c5ef18045 | [
"Apache-2.0"
] | null | null | null | Design Challenges.md | Isaac-Irvine/Text-Cluedo | 46b95993d555533e306a1703c67d426c5ef18045 | [
"Apache-2.0"
] | null | null | null | Design Challenges.md | Isaac-Irvine/Text-Cluedo | 46b95993d555533e306a1703c67d426c5ef18045 | [
"Apache-2.0"
] | 1 | 2021-09-15T08:11:37.000Z | 2021-09-15T08:11:37.000Z | * The way we implemented the player turns and loading of the game was completely different. (Constructor had to be changed)
* player.turn() had to be recreated. Lots of getters in the Player class had to be added.
* PlayerState had to be created, since player actions are not iterative anymore.
*These are just bullet points which may be useful for the report*
| 60.333333 | 123 | 0.781768 | eng_Latn | 0.999975 |
3b811e50ce633c8a1f65d714c0f831b478b32d3a | 408 | md | Markdown | bing_burp/README.md | tymyrddin/reomais | 818812d3d2f82172f5d3f573846d4d7d27a838c9 | [
"Unlicense"
] | null | null | null | bing_burp/README.md | tymyrddin/reomais | 818812d3d2f82172f5d3f573846d4d7d27a838c9 | [
"Unlicense"
] | null | null | null | bing_burp/README.md | tymyrddin/reomais | 818812d3d2f82172f5d3f573846d4d7d27a838c9 | [
"Unlicense"
] | null | null | null | # Bing for Burp
Microsoft’s Bing search engine has search capabilities that allow querying for all websites it finds on a single IP address using
the “IP” search modifier. Bing will also tell you [all of the subdomains of a given domain](/crawler) if you use the “domain” search modifier.
Scraping Bing for it is not very ethical, but using Bing's API and parsing the results in a script is another matter. | 68 | 142 | 0.784314 | eng_Latn | 0.999668 |
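To illustrate that script-based route, here is a minimal Python sketch. It assumes a Bing Web Search API v7-style setup — the endpoint, the `Ocp-Apim-Subscription-Key` header, and the `webPages.value[].url` response shape are assumptions to verify against the current API documentation — and only the offline JSON-parsing step is shown:

```python
from urllib.parse import urlparse


def hosts_from_bing_response(response_json):
    """Extract unique hostnames from a Bing Web Search API v7-style
    response, i.e. {"webPages": {"value": [{"url": ...}, ...]}}."""
    pages = response_json.get("webPages", {}).get("value", [])
    hosts = {urlparse(page["url"]).hostname for page in pages if "url" in page}
    return sorted(hosts)


# Canned response for demonstration; a real run would GET something like
# https://api.bing.microsoft.com/v7.0/search?q=ip:203.0.113.7
# with an "Ocp-Apim-Subscription-Key" header carrying your API key.
sample = {
    "webPages": {
        "value": [
            {"url": "https://app.example.com/login"},
            {"url": "https://example.com/"},
            {"url": "https://app.example.com/docs"},
        ]
    }
}

print(hosts_from_bing_response(sample))
# → ['app.example.com', 'example.com']
```

Keeping the parsing separate from the HTTP call lets you test it offline; swapping in a real request with your key yields live results.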
3b812f3351664d534fb966542883f94d5be4eaa4 | 1,701 | md | Markdown | _posts/2003-05-24-more-on-cat-sars-link.md | kerim/keywords | b16bf75bd1e6347b57e3e3ca62012ec198c110f8 | [
"MIT-0",
"MIT"
] | 1 | 2017-08-10T12:37:13.000Z | 2017-08-10T12:37:13.000Z | _posts/2003-05-24-more-on-cat-sars-link.md | kerim/keywords | b16bf75bd1e6347b57e3e3ca62012ec198c110f8 | [
"MIT-0",
"MIT"
] | null | null | null | _posts/2003-05-24-more-on-cat-sars-link.md | kerim/keywords | b16bf75bd1e6347b57e3e3ca62012ec198c110f8 | [
"MIT-0",
"MIT"
] | null | null | null | ---
title: More on Cat-SARS link
author: Kerim
layout: post
permalink: /archives/2003/05/24/more-on-cat-sars-link/
categories:
- Old Blog Import
---
Here is a much better explanation, from the <a href="http://www.washingtonpost.com/wp-dyn/articles/A33143-2003May23.html?nav=hptop_ts" onclick="_gaq.push(['_trackEvent', 'outbound-article', 'http://www.washingtonpost.com/wp-dyn/articles/A33143-2003May23.html?nav=hptop_ts', 'Washington Post']);" >Washington Post</a>, of the research that shows the cat-SARS link:
> The long-sought discovery came when a team of researchers from the University of Hong Kong and the Chinese government tested 25 animals from eight species being sold at a live animal market in the province of Guangdong, where the disease first emerged.
>
>
> The tests found a virus that appeared virtually identical to the SARS virus in saliva and feces of six catlike animals, known as masked palm civets. The researchers directly isolated virus from four of the animals and found pieces of genetic material from the microbe in two others. Tests also showed genetic evidence of the virus in feces of another animal, known as a raccoon dog, and an eighth animal, a Chinese ferret badger, had antibodies to the virus in its blood. None of the animals was sick.
>
>
>
> A detailed genetic analysis of the virus isolated from the animals found it was identical to the SARS virus from human patients except that it lacked one sequence. The missing genetic material carries instructions for the production of a small protein, known as a peptide, and may have been the change that allowed the virus to jump to humans and then spread readily, Stohr said.
>
| 73.956522 | 507 | 0.770723 | eng_Latn | 0.998213 |
3b81b4fadf98800c3059db7bc15e0680f813ce32 | 3,406 | md | Markdown | README.md | ramr/prom-haproxy-exporter | 9d0890af19622dbbf335daaf0b90784ee71518a0 | [
"MIT"
] | 1 | 2020-09-01T20:13:45.000Z | 2020-09-01T20:13:45.000Z | README.md | ramr/openshift-prometheus-haproxy-demo | 9d0890af19622dbbf335daaf0b90784ee71518a0 | [
"MIT"
] | null | null | null | README.md | ramr/openshift-prometheus-haproxy-demo | 9d0890af19622dbbf335daaf0b90784ee71518a0 | [
"MIT"
] | null | null | null | openshift-prometheus-haproxy-demo
=================================
Demo repository to publish the OpenShift HAProxy router statistics using
the prometheus haproxy exporter to prometheus.
Demo
----
1. Start your OpenShift cluster as you would normally. Instructions will
vary depending on how you do this in your environment. Example for a
development environment, you can do this as follows:
export WORKAREA="/home/ramr/workarea";
export GOPATH="${WORKAREA}"
mkdir -p "${WORKAREA}/src/github.com/openshift"
cd "${WORKAREA}/src/github.com/openshift"
git clone https://github.com/openshift/origin.git
cd origin
make # or make release to also build the images
nohup ./_output/local/bin/linux/amd64/openshift start --loglevel=4 &> /tmp/openshift.log &
2. Create a router service account and add it to the privileged SCC.
echo '{ "kind": "ServiceAccount", "apiVersion": "v1", "metadata": { "name": "router" } }' | oc create -f -
Either manually edit the privileged SCC and add the router account.
oc edit scc privileged
# ...
# users:
# - system:serviceaccount:openshift-infra:build-controller
# - system:serviceaccount:default:router
Or you can use jq to script it:
sudo yum install -y jq
oc get scc privileged -o json |
jq '.users |= .+ ["system:serviceaccount:default:router"]' |
oc replace scc -f -
3. Pull down the prometheus and haproxy-exporter images.
docker pull prom/haproxy-exporter
docker pull prom/prometheus
4. Start the router using the router service account we created above and
make sure you expose the haproxy metrics.
oadm router --credentials=$KUBECONFIG --service-account=router \
--replicas=1 --latest-images --expose-metrics
5. As mentioned in the https://github.com/ramr/nodejs-header-echo repo,
create a deployment, service and route.
# Update submodule to the nodejs-header-echo repo and build images.
(cd nodejs-header-echo && git submodule update --init --recursive && make)
# Create deployment + secure/insecure services.
oc create -f nodejs-header-echo/openshift/dc.json
oc create -f nodejs-header-echo/openshift/secure-service.json
oc create -f nodejs-header-echo/openshift/insecure-service.json
# Add a route that allows http and https.
oc create -f nodejs-header-echo/openshift/edge-secured-allow-http-route.json
# check the routes.
oc get routes
6. Wait a bit for the service to become available.
curl -vvv -H "Host: allow-http.header.test" http://127.0.0.1/
7. Run the prometheus server that will scrape the haproxy metrics.
Note: we use port 9999 as some environments have cockpit running on
port 9090 on the host.
make run || echo "see alternative instructions below ..."
echo "Alternatively, you can just start the docker container."
docker run -p 0.0.0.0:9999:9090 -dit ramr/openshift-prometheus-test
8. View the haproxy statistics in the prometheus display at:
http://<node-ipaddr>:9999/consoles/haproxy.html
9. Generate some demo load and you will see the haproxy stats within
prometheus.
ab -c 5 -t 30 -H "Host: allow-http.header.test" http://127.0.0.1/
| 32.75 | 114 | 0.667058 | eng_Latn | 0.824067 |
3b81df107769153540730ab916e7b430d74aa220 | 142 | md | Markdown | README.md | Temmy1/my-gatsby-blog | bf10d9f158e4ded00c6b297d688affacbf2b7377 | [
"MIT"
] | null | null | null | README.md | Temmy1/my-gatsby-blog | bf10d9f158e4ded00c6b297d688affacbf2b7377 | [
"MIT"
] | null | null | null | README.md | Temmy1/my-gatsby-blog | bf10d9f158e4ded00c6b297d688affacbf2b7377 | [
"MIT"
] | null | null | null | ## Моя страница с блогом
Сделано с помощью Gatsby, на основе [gatsby-starter-default](https://github.com/gatsbyjs/gatsby-starter-default).
| 23.666667 | 113 | 0.767606 | dan_Latn | 0.192024 |
3b830632bfccb0520b45e3d84d26096b83782b75 | 5,237 | md | Markdown | README.md | geoprocesamiento-2020-i/geoprocesamiento-2020-i.github.io | e6a4ccf1824b8cee495e709183ac337941380129 | [
"CC-BY-4.0"
] | 3 | 2020-03-15T22:30:48.000Z | 2020-06-10T22:44:41.000Z | README.md | geoprocesamiento-2020-i/geoprocesamiento-2020-i.github.io | e6a4ccf1824b8cee495e709183ac337941380129 | [
"CC-BY-4.0"
] | null | null | null | README.md | geoprocesamiento-2020-i/geoprocesamiento-2020-i.github.io | e6a4ccf1824b8cee495e709183ac337941380129 | [
"CC-BY-4.0"
] | 7 | 2020-04-20T19:12:30.000Z | 2021-11-17T22:18:25.000Z | # GF-0604: Procesamiento de datos geográficos
## Universidad de Costa Rica
### Escuela de Geografía
#### Descripción del curso
Este es un curso introductorio al procesamiento de datos geográficos mediante el lenguaje de programación R. Se estudian los fundamentos de la sintaxis de este lenguaje, sus bibliotecas geoespaciales y sus capacidades para generar gráficos y modelos estadísticos.
#### Programa
* [Programa del curso (modificado el 2020-04-14)](https://github.com/geoprocesamiento-2020i/programa-curso/blob/master/GF-0604-Procesamiento_datos_geograficos-Programa_curso-2020-I-20200414.pdf)
#### Lecciones
* [01 Introducción a la arquitectura de computadoras y a los lenguajes de programación](https://geoprocesamiento-2020i.github.io/leccion-01-introduccion/)
* [02 La sintaxis Markdown](https://geoprocesamiento-2020i.github.io/leccion-02-markdown/)
* [03 La sintaxis R Markdown](https://rmarkdown.rstudio.com/)
* [04 El lenguaje de programación R - Introducción](https://geoprocesamiento-2020i.github.io/leccion-04-r-introduccion/)
* [05 El lenguaje de programación R - Graficación](https://geoprocesamiento-2020i.github.io/leccion-05-r-graficacion/)
* [06 El lenguaje de programación R - Datos vectoriales](https://geoprocesamiento-2020i.github.io/leccion-06-r-datos-vectoriales/)
Lecturas previas:
- [Geocomputation with R - Chapter 1 Introduction](https://geocompr.robinlovelace.net/intro.html)
- [Geocomputation with R - Chapter 2 Geographic data in R](https://geocompr.robinlovelace.net/spatial-class.html)
- (Opcional) [Why R? Webinar 004 - Robin Lovelace + Jakub Nowosad - Recent changes in R spatial](https://www.youtube.com/watch?v=Va0STgco7-4)
* [07 El lenguaje de programación R - Datos vectoriales - operaciones con atributos](https://geoprocesamiento-2020i.github.io/leccion-07-r-datos-vectoriales-atributos/)
Lecturas previas:
- [Geocomputation with R - Chapter 3 Attribute data operations](https://geocompr.robinlovelace.net/attr.html)
* [08 El paquete Leaflet](https://geoprocesamiento-2020i.github.io/leccion-08-leaflet/)
* [09 El lenguaje de programación R - Datos vectoriales - operaciones espaciales](https://geoprocesamiento-2020i.github.io/leccion-09-r-datos-vectoriales-operaciones-espaciales/)
Lecturas previas:
- [Geocomputation with R - Chapter 4 Spatial data operations](https://geocompr.robinlovelace.net/spatial-operations.html)
* [10 El lenguaje de programación R - Datos raster](https://geoprocesamiento-2020i.github.io/leccion-10-r-datos-raster/)
Lecturas previas:
- [Geocomputation with R - Chapter 2 Geographic data in R - Raster data](https://geocompr.robinlovelace.net/spatial-class.html#raster-data)
- [Geocomputation with R - Chapter 3 Attribute data operations - Manipulating raster objects](https://geocompr.robinlovelace.net/attr.html#manipulating-raster-objects)
- [Geocomputation with R - Chapter 4 Spatial data operations - Spatial operations on raster data](https://geocompr.robinlovelace.net/spatial-operations.html#spatial-ras)
#### Laboratorios
* [01 Markdown](https://geoprocesamiento-2020i.github.io/laboratorio-01-markdown/)
* [02 R - Graficación 1](https://geoprocesamiento-2020i.github.io/laboratorio-02-r-graficacion-basica/)
* [03 R - Datos vectoriales 1](https://geoprocesamiento-2020i.github.io/laboratorio-03-r-datos-vectoriales-1/)
* [04 R - Datos raster 1](https://geoprocesamiento-2020i.github.io/laboratorio-04-r-datos-raster-1/)
#### Tareas programadas
* [01 Tablero de control sobre COVID-19 (básico)](https://geoprocesamiento-2020i.github.io/tarea-01-tablero-control-covid19/)
* [02 Tablero de control sobre COVID-19 (avanzado)](https://geoprocesamiento-2020i.github.io/tarea-02-tablero-control-covid19/)
#### Tutoriales
* [Git](https://geoprocesamiento-2020i.github.io/tutorial-git/)
* [flexdashboard](https://geoprocesamiento-2020i.github.io/tutorial-flexdashboard/)
* [rgbif](https://geoprocesamiento-2020i.github.io/tutorial-rgbif/)
#### Otros
* [Datos utilizados durante el curso](https://github.com/geoprocesamiento-2020i/datos)
* [Videos de las lecciones](https://www.youtube.com/playlist?list=PL1gEgLSwAJeLl246l2ArAZUQw3ChvlalH)
Este sitio ha sido construído con las siguientes herramientas y sintaxis:
- [GitHub Pages](https://pages.github.com/)
- [Jekyll](https://jekyllrb.com/)
- [RStudio](https://rstudio.com/)
- [Markdown](https://daringfireball.net/projects/markdown/)
- [R Markdown - ioslides_presentation](https://bookdown.org/yihui/rmarkdown/ioslides-presentation.html)
- [R Markdown - html_document](https://bookdown.org/yihui/rmarkdown/html-document.html)
#### Licencia de uso
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Licencia Creative Commons" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br /><span xmlns:dct="http://purl.org/dc/terms/" property="dct:title">Los contenidos de este curso</span>, escrito por <a xmlns:cc="http://creativecommons.org/ns#" href="https://github.com/mfvargas" property="cc:attributionName" rel="cc:attributionURL">Manuel Vargas</a>, se comparten mediante una <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Licencia Creative Commons Atribución 4.0 Internacional</a>.
| 78.164179 | 615 | 0.771625 | spa_Latn | 0.475081 |
3b830f5d193c65b9a93ad898b7093c9b5ef0497c | 72 | md | Markdown | README.md | MaikeMota/PI-2017-1 | 9c21cf77952271dc613cb2c5a204f028596c4c44 | [
"MIT"
] | 1 | 2018-04-09T13:45:45.000Z | 2018-04-09T13:45:45.000Z | README.md | MaikeMota/PI-2017-1 | 9c21cf77952271dc613cb2c5a204f028596c4c44 | [
"MIT"
] | null | null | null | README.md | MaikeMota/PI-2017-1 | 9c21cf77952271dc613cb2c5a204f028596c4c44 | [
"MIT"
] | null | null | null | # PI-2017-1
Source code repository for the Projeto Integrador (Integrative Project) 2017-1
| 24 | 59 | 0.805556 | por_Latn | 0.995789 |
3b8426715edd2afc94ea8688b5e20b39c56dbee0 | 407 | md | Markdown | README.md | hopemanryan/datanyze-schematics | 6b73701c26fe3de6e6f54c4947c7bf3069341273 | [
"MIT"
] | null | null | null | README.md | hopemanryan/datanyze-schematics | 6b73701c26fe3de6e6f54c4947c7bf3069341273 | [
"MIT"
] | null | null | null | README.md | hopemanryan/datanyze-schematics | 6b73701c26fe3de6e6f54c4947c7bf3069341273 | [
"MIT"
] | null | null | null | # Datanyze Schemaitcs
##### Component:
```
schenmatics .:comp --path ./
```
##### Service:
```
schenmatics .:service --path ./
```
Both Schematics will ask a small list of questions in order to build you component
This is not as a smart as ng schematics :)
Please fork and update as you wish
tutorial: https://medium.com/@tomastrajan/total-guide-to-custom-angular-schematics-5c50cf90cdb4
| 19.380952 | 95 | 0.692875 | eng_Latn | 0.919014 |
3b84606b0b95a0101810ad61a557e392d15b6d79 | 305 | md | Markdown | README.md | wangdehu/os | 094515f4dbfa51375ff4b508c873ab7b6b8ac1f6 | [
"MIT"
] | 3 | 2018-11-14T16:10:48.000Z | 2021-06-10T12:49:38.000Z | README.md | wangdehu/os | 094515f4dbfa51375ff4b508c873ab7b6b8ac1f6 | [
"MIT"
] | null | null | null | README.md | wangdehu/os | 094515f4dbfa51375ff4b508c873ab7b6b8ac1f6 | [
"MIT"
] | null | null | null | # os
To do it! Do all that I can!
os.md contains the reflections and lessons learned from building the operating system.
References:
- *Assembly Language* by Wang Shuang
- *x86 Assembly Language: From Real Mode to Protected Mode* by Li Zhong and Wang Xiaobo
- *Modern Operating Systems* by Andrew S. Tanenbaum
- *Computer Systems: A Programmer's Perspective* by Randal E. Bryant et al.
- [uCore OS lab guide](https://chyyuu.gitbooks.io/ucore_os_docs/)
- [ucore_os_lab project](https://github.com/chyyuu/ucore_os_lab)
- *Write a Makefile with Me* (跟我一起写Makefile) by Chen Hao | 20.333333 | 61 | 0.72459 | yue_Hant | 0.831402 |
3b84edda157b46cb5d4c5cd632e928a77cf64a64 | 804 | md | Markdown | docs/CONTRIBUTING.md | DevTwerx/the-digger | c5c41e20504c0ffa840ba22404300131e3e39a49 | [
"MIT"
] | null | null | null | docs/CONTRIBUTING.md | DevTwerx/the-digger | c5c41e20504c0ffa840ba22404300131e3e39a49 | [
"MIT"
] | 11 | 2020-03-03T00:11:57.000Z | 2020-03-03T22:30:25.000Z | docs/CONTRIBUTING.md | DevTwerx/the-digger | c5c41e20504c0ffa840ba22404300131e3e39a49 | [
"MIT"
] | null | null | null | # Making your contribution!
Hi! This is a super small project, but I assure you, we appreciate your desire to provide a contribution of any sort, even if it's just a bug report!
### Bug reports
If you have a bug, submit it as a GitHub issue with as much information as possible. We do have a [code of conduct](https://github.com/DevTwerx/the-digger/blob/master/documentation/CODE_OF_CONDUCT.md) that we would appreciate you abiding by.
### Features
If you want to help submit work for an existing work item (we aren't at this point considering random new features from people!), please just submit a pull request against the appropriate feature branch, link the related work item, and provide some description of your code. Someone will be along to review your pull request with all due haste, I'm sure!
| 89.333333 | 353 | 0.781095 | eng_Latn | 0.999228 |
3b855a7cf0376f641368e8d9f9ce5354b6c7fe03 | 3,294 | md | Markdown | investigating-with-grr/flows/starting.md | simstoykov/grr-doc | 94443f0e83ad01a7005e53ab85eab94b34b6aa3b | [
"Apache-2.0"
] | 305 | 2015-01-02T13:57:39.000Z | 2022-02-20T01:49:55.000Z | investigating-with-grr/flows/starting.md | simstoykov/grr-doc | 94443f0e83ad01a7005e53ab85eab94b34b6aa3b | [
"Apache-2.0"
] | 98 | 2015-01-20T14:34:42.000Z | 2021-11-24T22:56:56.000Z | investigating-with-grr/flows/starting.md | simstoykov/grr-doc | 94443f0e83ad01a7005e53ab85eab94b34b6aa3b | [
"Apache-2.0"
] | 202 | 2015-01-20T12:22:31.000Z | 2022-02-22T05:26:54.000Z | # Starting Flows
To start a new Flow, simply click on the *Start new flows* option on the
left panel while having a client selected. The main panel will populate with the holy trinity of panels. The tree view shows all the Flows organized by category.
For example, in order to start a *FileFinder* flow,
expand the *FileSystem* category and select the corresponding item.
The flow view will populate with a form with all the user-configurable
parameters for this flow. What’s more, because each parameter has a
well-defined type, GRR shows you widgets to select a value for each
of them.
The FileFinder flow accepts a range of parameters:
1. *Paths*. This is a list of textual paths that you want to look at.
2. *Pathtype*. Which VFS handler you want to use for the path.
Available options are:
- **OS**. Uses the OS "open" facility. These are the most
straightforward for a first user. Examples of *os* paths are
`C:/Windows` on Windows or `/etc/init.d/` on Linux/OSX.
    - **TSK**. Use Sleuthkit. Because Sleuthkit is invoked, a path to
      the device is needed along with the actual directory path. Examples
of *tsk* paths are
`\\?\Volume{19b4a721-6e90-12d3-fa01-806e6f6e6963}\Windows` for
      Windows or `/dev/sda1/init.d/` on Linux (but GRR is smart enough to figure out what you want if you use `C:\Windows` or `/init.d/` instead, even though there is some guessing involved).
- **REGISTRY**. Windows-related. You can open the live Windows
registry as if it was a virtual filesystem. So you can specify
a path such as `HKEY_LOCAL_MACHINE/Select/Current`.
- **MEMORY** and **TMPFILE** are internal and should not be used in most cases.
3. *Condition*. The *FileFinder* can filter files based on conditions like file size or file contents. The different conditions should be self-explanatory. Multiple conditions can be stacked; the file will only be processed if it fulfills them all.
4. *Action*. Once a file passes all the conditions, the action decides what should be done with it. Options are **STAT**, **HASH** and **DOWNLOAD**. Stat basically just indicates if a file exists; this is mostly used to list directories (path `C:\Windows\*` and action STAT). Hash returns a list of hashes of the file, and Download collects the file from the client and stores it on the server.
For this example, a good set of arguments would be a directory listing, something like path `C:\Windows\*` or `/tmp/*` and action **STAT**. Once you’ve filled in each required field, click on *Launch*; if all
parameters validate, the Flow will run. Now you can go to the *Manage
launched flows* view to find it running or track it.
> **Important**
> Not all flows might be available on every platform. When trying to run
> a flow that’s not available on the given platform, an error will show
> up.
### Available flows ###
The easiest way to see the current flows is to check the AdminUI under StartFlow. These have useful documentation.
Note that by default only BASIC flows are shown in the Admin UI. By clicking the settings (gear icon) in the top right, you can enable ADVANCED flows. With this set you will see many of the underlying flows which are sometimes useful, but require a deeper understanding of GRR. | 62.150943 | 393 | 0.739223 | eng_Latn | 0.999383 |
3b860b995bda7af57c091b50e2ec8f8c44692df1 | 2,282 | md | Markdown | README.md | oceanprotocol/github-projects | 2f6cccd351529ff931a4df7b89aa89eeddbec0dc | [
"MIT"
] | 7 | 2018-11-21T21:07:25.000Z | 2019-04-25T21:16:57.000Z | README.md | oceanprotocol/github-projects | 2f6cccd351529ff931a4df7b89aa89eeddbec0dc | [
"MIT"
] | 37 | 2019-04-27T07:13:21.000Z | 2022-03-28T01:16:24.000Z | README.md | oceanprotocol/github-projects | 2f6cccd351529ff931a4df7b89aa89eeddbec0dc | [
"MIT"
] | 2 | 2021-12-31T08:55:01.000Z | 2022-01-10T04:36:23.000Z | [![banner](https://raw.githubusercontent.com/oceanprotocol/art/master/github/repo-banner%402x.png)](https://oceanprotocol.com)
<h1 align="center">github-projects</h1>
> Microservice to cache and expose GitHub projects for use throughout [oceanprotocol.com](https://oceanprotocol.com).
[![Build Status](https://travis-ci.com/oceanprotocol/github-projects.svg?branch=master)](https://travis-ci.com/oceanprotocol/github-projects)
[![js oceanprotocol](https://img.shields.io/badge/js-oceanprotocol-7b1173.svg)](https://github.com/oceanprotocol/eslint-config-oceanprotocol)
[![Greenkeeper badge](https://badges.greenkeeper.io/oceanprotocol/github-projects.svg)](https://greenkeeper.io/)
<img src="http://forthebadge.com/images/badges/powered-by-electricity.svg" height="20"/>
<img src="http://forthebadge.com/images/badges/as-seen-on-tv.svg" height="20"/>
<img src="http://forthebadge.com/images/badges/uses-badges.svg" height="20"/>
## API
Endpoint: [`https://oceanprotocol-github.now.sh`](https://oceanprotocol-github.now.sh)
### GET /
**200**: Returns a list of all public projects as follows
```json
[
{
"name": "project-name",
"description": "The description",
"url": "https://github.com/oceanprotocol/project",
"stars": 3040,
"forks": 293,
"isFork": false,
"isArchived": false,
"topics": [
"oceanprotocol",
"oceanprotocol-driver",
"python"
]
}
]
```
## Development
Install dependencies:
```bash
npm install -g now
npm install
```
And run the server:
```bash
npm start
```
## Test
Run the tests:
```bash
npm test
```
## Deployment
Every branch is automatically deployed to [Now](https://zeit.co/now) with their GitHub integration. A link to a deployment will appear under each Pull Request.
The latest deployment of the `master` branch is automatically aliased to `oceanprotocol-github.now.sh`, configured as `alias` in [`now.json`](now.json).
### Manual Deployment
If needed, the app can be deployed manually. Make sure to switch to the Ocean Protocol org before deploying:
```bash
# first run
now login
now switch
# deploy
now
# switch alias to new deployment
now alias
```
## Authors
- Matthias Kretschmann ([@kremalicious](https://github.com/kremalicious)) - [Ocean Protocol](https://oceanprotocol.com)
| 25.931818 | 159 | 0.7156 | eng_Latn | 0.498742 |
3b868872d4a1cb328bb6f2fea416cbf5376cac08 | 684 | md | Markdown | README.md | TeckZy/BlockChain-App-UI-Landing-Page | 83afb83ab7b46f94ea81ccfbbdffb8f65865fd7c | [
"MIT"
] | null | null | null | README.md | TeckZy/BlockChain-App-UI-Landing-Page | 83afb83ab7b46f94ea81ccfbbdffb8f65865fd7c | [
"MIT"
] | 5 | 2021-09-02T11:46:45.000Z | 2022-03-02T08:48:35.000Z | README.md | TeckZy/BlockChain-App-UI-Landing-Page | 83afb83ab7b46f94ea81ccfbbdffb8f65865fd7c | [
"MIT"
] | null | null | null | # BlockChain-App-UI-Landing-Page
## Installation
- All the `code` required to get started
### Clone
- Clone this repo to your local machine using `https://github.com/TeckZy/BlockChain-App-UI-Landing-Page.git`
### Setup and Run
> update and install this package first
> now install npm and bower packages
```shell
$ npm install
$ npm run start
```
---
---
## Team
> Or Contributors/People
Manoj
---
## License
[![License](http://img.shields.io/:license-mit-blue.svg?style=flat-square)](http://badges.mit-license.org)
- **[MIT license](http://opensource.org/licenses/mit-license.php)**
- Copyright 2015 © <a href="https://github.com/TeckZy" target="_blank">TecKzy</a>.
| 17.1 | 108 | 0.694444 | eng_Latn | 0.405881 |
3b86ddf752004a200d3efb31c3880f04f8f4c214 | 173 | md | Markdown | proposals/subnet_management/20210805T1240Z.md | egeyar/nns-proposals | 0e4752e7e84ac66739db6bb680f34f335818f93b | [
"Apache-2.0"
] | 50 | 2021-05-22T05:04:41.000Z | 2022-03-09T12:12:24.000Z | proposals/subnet_management/20210805T1240Z.md | egeyar/nns-proposals | 0e4752e7e84ac66739db6bb680f34f335818f93b | [
"Apache-2.0"
] | 7 | 2021-05-23T14:43:14.000Z | 2021-11-12T17:10:11.000Z | proposals/subnet_management/20210805T1240Z.md | egeyar/nns-proposals | 0e4752e7e84ac66739db6bb680f34f335818f93b | [
"Apache-2.0"
] | 31 | 2021-05-21T11:20:51.000Z | 2021-10-20T23:51:17.000Z | Proposing on [mainnet] to upgrade subnet 5 [w4asl-4nmyj-qnr7c-6cqq4-tkwmt-o26di-iupkq-vx4kt-asbrx-jzuxh-4ae] to the replica version c47a773b97f9e45b2760caaee4ad24aa6d5c9b69
| 86.5 | 172 | 0.843931 | eng_Latn | 0.159975 |
3b875451ca18272ae54a09e5f2c070ec202b294d | 393 | md | Markdown | _posts/2021-07-12-Come-fuck-my-pussy-the-way-it-should-be-fucked-20210712222940287928.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-12-Come-fuck-my-pussy-the-way-it-should-be-fucked-20210712222940287928.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-12-Come-fuck-my-pussy-the-way-it-should-be-fucked-20210712222940287928.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | ---
title: "Come fuck my pussy the way it should be fucked."
metadate: "hide"
categories: [ Pussy ]
image: "https://preview.redd.it/rrd4h6wcusa71.jpg?auto=webp&s=4ce6b01f8600c79213bee9e3746af5236b338693"
thumb: "https://preview.redd.it/rrd4h6wcusa71.jpg?width=1080&crop=smart&auto=webp&s=469537982055bbcf6c4d30d978d2693d9231ae46"
visit: ""
---
Come fuck my pussy the way it should be fucked.
| 39.3 | 125 | 0.776081 | yue_Hant | 0.443455 |
3b879a7c0d75dcf8ffbe1fcbb7b9d96c6560cec4 | 771 | md | Markdown | README.md | dvlden/rollup-config | 67ff8507313387fe0376595bc8b45b2b0a9c358c | [
"MIT"
] | 2 | 2019-05-26T17:05:32.000Z | 2020-10-05T15:27:21.000Z | README.md | dvlden/rollup-config | 67ff8507313387fe0376595bc8b45b2b0a9c358c | [
"MIT"
] | 4 | 2021-10-01T17:09:08.000Z | 2021-10-01T17:09:13.000Z | README.md | dvlden/rollup-config | 67ff8507313387fe0376595bc8b45b2b0a9c358c | [
"MIT"
] | null | null | null | # Rollup Config
The config is fully transparent. You don't need to touch it unless you want
to implement some additional plugins or change output formats.
It is currently configured to output `ES` and `CJS` with type definitions.
For the purpose of module building it is as minimal as possible.
Tests are handled by the Jest testing framework.
## Getting started
- `git clone https://github.com/dvlden/rollup-lib-config.git`
- `npm i`
### Commands
- `npm run test` - run all tests
- `npm run build` - clean output directory and build for production
## Important Notes
If you cloned my repository and want to use this for module development, before
you push any changes to Git or npm, make sure to update these important files:
- package.json
- readme
- license
| 24.09375 | 80 | 0.754864 | eng_Latn | 0.998489 |
3b87cb5bc1c08b60ad70346787e1b0b31d2eac6f | 784 | md | Markdown | src/patterns/StateInitializer/README.md | isan26/react-patterns | 8060f3601c380a83569f73cd98a811df8fbff694 | [
"Apache-2.0"
] | 1 | 2022-03-30T15:19:19.000Z | 2022-03-30T15:19:19.000Z | src/patterns/StateInitializer/README.md | isan26/react-patterns | 8060f3601c380a83569f73cd98a811df8fbff694 | [
"Apache-2.0"
] | null | null | null | src/patterns/StateInitializer/README.md | isan26/react-patterns | 8060f3601c380a83569f73cd98a811df8fbff694 | [
"Apache-2.0"
] | null | null | null | # State Initializer
## The problem
How can we set up a component when we use it, and feed it its first values? Imagine we have a counter that needs to start not at 0 but at 10: how do we configure the component to take that initial state?
## The solution
The state initializer pattern works by providing the initial state through the component's props and using it to seed the component's internal state.
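A minimal, framework-free sketch of the idea (the `createCounter` factory and the `initialCount` prop are illustrative names, not part of this repository's code):

```javascript
// Hypothetical sketch of the state-initializer idea: the component-like
// factory receives its first value via "props" (initialCount) and copies
// it once into its own internal state.
function createCounter({ initialCount = 0 } = {}) {
  let count = initialCount; // internal state, seeded once from props

  return {
    increment() { count += 1; return count; },
    value() { return count; },
  };
}

// The counter starts at 10 because the caller initialized it that way.
const counter = createCounter({ initialCount: 10 });
counter.increment();
console.log(counter.value()); // 11
```

In React the same idea typically looks like `useState(props.initialCount)`, which carries the same caveat: later changes to the prop do not update the already-initialized state.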
## Consequences
Components using this pattern can be reused in multiple ways: they can be configured first and then used.
This has to be done carefully, though: if the prop used to initialize the component's values changes after the component has rendered, the component will not pick up the change, because it has already been initialized.
| 60.307692 | 244 | 0.795918 | eng_Latn | 0.999995 |
3b88757ce8f21cb4cb22282f8257613f450a4b4c | 1,240 | md | Markdown | _posts/2016-09-07-why-wai-khru.md | neizod/neizod.github.io | 725f1306413c81ce8982091216575c0235cf1c2b | [
"MIT"
] | 6 | 2016-02-28T21:49:50.000Z | 2022-03-17T20:23:49.000Z | _posts/2016-09-07-why-wai-khru.md | neizod/neizod.github.io | 725f1306413c81ce8982091216575c0235cf1c2b | [
"MIT"
] | null | null | null | _posts/2016-09-07-why-wai-khru.md | neizod/neizod.github.io | 725f1306413c81ce8982091216575c0235cf1c2b | [
"MIT"
] | 3 | 2018-10-13T05:45:20.000Z | 2022-01-29T03:38:32.000Z | ---
title: Why Wai Khru?
tags:
- Thought
- Tradition
- Education
date: 2016-09-07 01:55:55 +0700
---
One ritual I have wondered about since childhood is, of course, the school ceremony of wai khru (paying respect to teachers).
To be fair, the overall idea behind wai khru makes sense: knowledge is not easily won, so besides honoring knowledge itself, we should not forget to show respect to the teachers who passed that knowledge on to us.
But the question that has always nagged me is: why is the wai khru ceremony held only a few days or months after the term begins?
Before we have even taken the first midterm we are already arranging ceremonial trays for wai khru; before we have gained much knowledge at all, we must already display our humility as students.
If the ceremony were moved to the end of the term it would arguably make more sense, since by then we would have received the knowledge in full and could properly pay respect to the teachers and to that knowledge.
Or have we misunderstood it all along, assuming the point is to honor knowledge itself, when in reality it is only about the surface, about individuals: we must crawl in and prostrate ourselves, begging the teacher to accept us as students, or else the knowledge will not be passed on?
Looked at this way, a teacher who hoards knowledge to the point of demanding prostration before teaching it hardly seems worth respecting or apprenticing oneself to at all.
Of course, teachers have the right to choose their students, and students must choose their teachers; but carrying oneself loftily as the superior in knowledge while pressing others down hardly seems fitting for the way of an intellectual.
| 49.6 | 199 | 0.732258 | tha_Thai | 0.999571 |
3b88f709b424902ee724b7e9bb6444146f06e381 | 2,415 | md | Markdown | meeting-notes/2021/Q1/2021-01-29--sow-gomobile-ipfs.md | LFGaming/community-1 | b6aff6d8cbbc008667789891d24a545eb293ed11 | [
"MIT"
] | 36 | 2020-05-01T09:53:30.000Z | 2022-03-11T16:49:20.000Z | meeting-notes/2021/Q1/2021-01-29--sow-gomobile-ipfs.md | LFGaming/community-1 | b6aff6d8cbbc008667789891d24a545eb293ed11 | [
"MIT"
] | 25 | 2019-08-30T15:30:06.000Z | 2020-04-06T16:55:07.000Z | meeting-notes/2021/Q1/2021-01-29--sow-gomobile-ipfs.md | LFGaming/community-1 | b6aff6d8cbbc008667789891d24a545eb293ed11 | [
"MIT"
] | 7 | 2020-04-30T09:36:25.000Z | 2021-09-16T15:41:03.000Z | # SOW Update - Gomobile IPFS - 29 Jan
This is our fifth follow-up report regarding the gomobile-ipfs SOW.
This report is mainly an outline of our conversation with Dietrich on 27 Jan 2021.
## Updates on Gomobile-IPFS
* We extended gomobile-ipfs configuration, so we can integrate it within Berty Messenger
* We prepared the codebase in order to make it easy to add new drivers over time
* Everything was made in a way that allows various usages:
- **full** gomobile-ipfs: use the library without custom go code
- **hybrid** gomobile-ipfs: extend & make custom configuration of the library with go code
* At the end of the week, Berty Messenger will depend on gomobile-ipfs; we will then start using it on a daily basis
Current architecture detailed here: https://github.com/berty/berty/blob/master/docs/architecture/2020-11-27-adr-gomobile-ipfs.md
Our plan stays the same as before:
1. we experiment with drivers and concepts in Berty, and then
2. we port the stable components to gomobile-ipfs, for everyone.
### Next steps:
- add drivers interface, that will be exposed to both `gomobile-ipfs/bind` package & `berty/bridgeframework` package
- move basic drivers from Berty to gomobile-ipfs
- native-logger driver
  - connectivity driver
- discuss with PL team about mobile optimizations
- rendezvous-point: what's the future of https://github.com/libp2p/go-libp2p-rendezvous/pull/1
- recycling connections when switching from a network to another
- long-term roadmap stays the same as before: https://github.com/berty/community/blob/master/meeting-notes/2020/Q4/2020-11-27--sow-gomobile-ipfs.md
## Current Snapshots of the architecture at Berty:
* [Gomobile-IPFS](https://github.com/berty/berty/blob/master/docs/architecture/2020-11-27-adr-gomobile-ipfs.md)
* [GRPC-bridge](https://github.com/berty/berty/blob/master/docs/architecture/2020-11-27-adr-berty-grpc-bridge.txt)
## Read our previous reports:
* [Report #3](https://github.com/berty/community/blob/master/meeting-notes/2020/Q4/2020-11-27--sow-gomobile-ipfs.md)
* [Report #2](https://github.com/berty/community/blob/master/meeting-notes/2020/Q4/2020-11-02--sow-gomobile-ipfs.md)
* [Report #1](https://github.com/berty/community/blob/master/meeting-notes/2020/Q4/2020-10-20--sow-gomobile-ipfs.md)
* [Report #0](https://github.com/berty/community/blob/master/meeting-notes/2020/Q4/2020-10-02--sow-gomobile-ipfs.md)
| 47.352941 | 147 | 0.761077 | eng_Latn | 0.817486 |
3b890721ebfeeddbb7abc7aa772cf3dad13abaec | 821 | markdown | Markdown | _posts/2019-12-17-cancelling-hhvm-4.37.markdown | lexidor/hhvm.com | 90b0d75bea9d302b4b74bec82069f4bf2f87d55d | [
"CC-BY-4.0"
] | 23 | 2016-11-28T02:47:23.000Z | 2021-12-11T00:06:27.000Z | _posts/2019-12-17-cancelling-hhvm-4.37.markdown | lexidor/hhvm.com | 90b0d75bea9d302b4b74bec82069f4bf2f87d55d | [
"CC-BY-4.0"
] | 55 | 2018-07-02T18:38:11.000Z | 2022-03-31T21:25:52.000Z | _posts/2019-12-17-cancelling-hhvm-4.37.markdown | lexidor/hhvm.com | 90b0d75bea9d302b4b74bec82069f4bf2f87d55d | [
"CC-BY-4.0"
] | 28 | 2017-01-20T18:55:37.000Z | 2022-03-02T21:05:20.000Z | ---
title: "Holiday schedule: cancelling HHVM 4.37"
layout: post
author: fred
category: blog
---
We have decided to cancel next week's release of HHVM 4.37, and expect to delay
4.38 until Thursday, the 2nd of January. HHVM 4.31 will be supported until the
release of HHVM 4.38 or above. As HHVM 4.32 has long-term-support (LTS), it is
unaffected by this change.
Our usual schedule is to release on Mondays or Tuesdays, and we expect many
people - both at Facebook, and among our users - to be on vacation for Tuesday
and Wednesday for the next two weeks. We
believe that releasing new versions with breaking changes at this time would not
be in the best interests of our users, especially as we would be less able to
support them during this time.
We look forward to continuing the evolution of Hack and HHVM in 2020.
| 39.095238 | 80 | 0.772229 | eng_Latn | 0.999872 |
3b89e34b02035e4d261d4708f1fb7d93240ed698 | 470 | md | Markdown | _drafts/2010-08-05-post-teaser-image-og-override.md | Cobuilding/cobuilding.ca | e700dfc0c71071c3d192eb86606e40a80f828ea0 | [
"MIT"
] | null | null | null | _drafts/2010-08-05-post-teaser-image-og-override.md | Cobuilding/cobuilding.ca | e700dfc0c71071c3d192eb86606e40a80f828ea0 | [
"MIT"
] | 12 | 2020-12-08T01:07:33.000Z | 2021-03-03T17:02:02.000Z | _drafts/2010-08-05-post-teaser-image-og-override.md | Cobuilding/cobuilding.ca | e700dfc0c71071c3d192eb86606e40a80f828ea0 | [
"MIT"
] | null | null | null | ---
title: 'Post: Teaser Image with OpenGraph Override'
header:
teaser: "/assets/images/page-header-teaser.png"
og_image: "/assets/images/page-header-og-image.png"
categories:
- Layout
- Uncategorized
tags:
- edge case
- image
- layout
last_modified_at: 2017-10-26T15:12:19.000-04:00
---
This post has a teaser image with an OpenGraph override.
```yaml
header:
teaser: /assets/images/page-header-teaser.png
og_image: /assets/images/page-header-og-image.png
``` | 21.363636 | 56 | 0.738298 | eng_Latn | 0.297372 |
3b8ac0926751f64bf8e0452b541e6b2c7897b397 | 1,502 | md | Markdown | projects/qiskit_nature.md | unitaryfund/unitaryhackdev | c7b45e3978ee39164447fa228541ed49508feaef | [
"MIT"
] | 1 | 2022-03-29T13:24:40.000Z | 2022-03-29T13:24:40.000Z | projects/qiskit_nature.md | unitaryfund/unitaryhackdev | c7b45e3978ee39164447fa228541ed49508feaef | [
"MIT"
] | 2 | 2022-03-22T08:17:05.000Z | 2022-03-31T23:26:20.000Z | projects/qiskit_nature.md | unitaryfund/unitaryhackdev | c7b45e3978ee39164447fa228541ed49508feaef | [
"MIT"
] | 2 | 2022-03-14T18:16:27.000Z | 2022-03-28T18:59:16.000Z | ---
title: Qiskit Nature
emoji: 👩🏽🔬
project_url: "https://github.com/Qiskit/qiskit-nature/"
metaDescription: Quantum applications in chemistry, physics, and biology.
date: 2022-04-27
summary: qiskit-nature is a Qiskit application module for chemistry, physics, and biology.
tags:
- python
- quantum-chemistry
- algorithms
bounties:
- name: Implement a succ_full ansatz option
issue_num: 91
value: 200
- name: Improve performance of mode_based_mapping method in QubitMapper
issue_num: 644
value: 50
---
[Qiskit](https://qiskit.org) is an open-source SDK for working with quantum computers at the level of pulses, circuits, and application modules.
The `qiskit-nature` package is a Qiskit application module for chemistry, physics, and biology.
If it is your first time contributing to `qiskit-nature`, please take a look at our [contribution guidelines](https://github.com/Qiskit/qiskit-nature/blob/main/CONTRIBUTING.md).
> If you want to participate with a simpler issue, check out our [list of unassigned `good first issues`](https://github.com/Qiskit/qiskit-nature/issues?q=is%3Aopen+is%3Aissue+no%3Aassignee+label%3A%22good+first+issue%22).
> For support from the community, please [join the Qiskit Slack community](https://ibm.co/joinqiskitslack) and use our [Qiskit Slack channel](https://qiskit.slack.com) for discussion. The channels `#qiskit-dev` and `#qiskit-pr-help` might be a good place for questions on the development and the PRing process respectively.
| 46.9375 | 323 | 0.765646 | eng_Latn | 0.88243 |
3b8af66f27c882ee4348cd4e2130e2833eb3dd26 | 1,631 | md | Markdown | api/README.md | TayaPenskaya/SoftwareTesting | f83b781b285a09625dc421fa30eba7e72d518105 | [
"MIT"
] | null | null | null | api/README.md | TayaPenskaya/SoftwareTesting | f83b781b285a09625dc421fa30eba7e72d518105 | [
"MIT"
] | null | null | null | api/README.md | TayaPenskaya/SoftwareTesting | f83b781b285a09625dc421fa30eba7e72d518105 | [
"MIT"
] | null | null | null | # Express + Node.js + MongoDB
## API Methods
* register user: ```curl -H "Content-Type: application/json" -d '{"username":"name", "password":"pass"}' http://localhost:port/api/users/register```
* login: ```curl -H "Content-Type: application/json" -H "Authorization: Bearer ${TOKEN}" -d '{"username":"name", "password":"pass"}' http://localhost:port/api/users/login```
* create new table: ```curl -H "Content-Type: application/json" -H 'Authorization: Bearer ${TOKEN}' -d '{"seats":10, "rake":5}' http://localhost:port/api/tables```
* get all tables: ```curl -i -H "Accept: application/json" -H "Content-Type: application/json" -H 'Authorization: Bearer ${TOKEN}' -X GET http://localhost:port/api/tables```
* play on table: ```curl -H "Content-Type: application/json" -H 'Authorization: Bearer ${TOKEN}' -d '{"table":{"id":"table_id"}}' http://localhost:port/api/tables/table_id/play ```
* unsubscribe from table: ```curl -X DELETE -H "Content-Type: application/json" -H 'Authorization: Bearer ${TOKEN}' -d '{"table":{"id":"table_id"}}' http://localhost:port/api/tables/table_id/play```
* mongo:
```
> mongo
> show databases
> use poker
> show collections
> show users
> db.users.find()
> db.tables.find()
```
## Useful Links
* [JWT](https://stackabuse.com/authentication-and-authorization-with-jwts-in-express-js/)
* [Lib for JWT in Express](https://github.com/auth0/express-jwt)
* [Mongoose](https://mongoosejs.com/docs/index.html)
* [Passport](https://www.npmjs.com/package/passport)
* [JWT docs](https://www.npmjs.com/package/jsonwebtoken)
* [Article about JWT && NodeJs](https://bezkoder.com/node-js-mongodb-auth-jwt/) | 54.366667 | 198 | 0.687308 | yue_Hant | 0.603483 |
3b8b91a3f30945623cc5a72bf071c9a17feadaa4 | 1,109 | md | Markdown | documentation/services/integrationService.md | PhilipSkinner/elementary | 6b31f282aa85b407569526d679fe6826f68f6d2c | [
"MIT"
] | 54 | 2020-02-17T07:51:21.000Z | 2022-03-31T20:45:47.000Z | documentation/services/integrationService.md | PhilipSkinner/elementary | 6b31f282aa85b407569526d679fe6826f68f6d2c | [
"MIT"
] | 91 | 2020-03-10T21:24:35.000Z | 2021-03-26T22:19:16.000Z | documentation/services/integrationService.md | PhilipSkinner/elementary | 6b31f282aa85b407569526d679fe6826f68f6d2c | [
"MIT"
] | 8 | 2020-09-18T13:37:11.000Z | 2022-02-21T10:26:42.000Z | [Back to Services](/documentation/services)
# Integrations Service
The integrations service allows you to call into any integrations defined within the system. It surfaces the following methods:
* callIntegration
These methods are covered in more detail below.
### callIntegration
Parameters:
* `name` - string, the name of the integration to call
* `method` - string, the HTTP method to call the integration using
* `params` - object, the query parameters (or body) to send to the integration endpoint
* `token` - string, the access token to use to access the API *optional*
Calls the named integration with the parameters given and returns the JSON response (as a parsed object) within a promise. This method can reject with an error, which needs to be handled correctly.
This can be called from your controllers like so:
```
module.exports = {
events : {
load : (event) => {
return this.integrationService.callIntegration(
"getWeatherReport",
"get",
{
location : "Cullingworth, UK"
}
).then((result) => {
...
}).catch((err) => {
...
});
}
}
};
``` | 26.404762 | 193 | 0.694319 | eng_Latn | 0.992221 |
3b8c1241b3594935d8c8a2e088c7e760bc83ae21 | 68 | md | Markdown | results/metaproteomics/README.md | mkorlevic/Korlevic_SelectiveRemoval_FrontMicrobiol_2021 | b9c06ce5bf45821eae652846c89e73e16d55274d | [
"MIT"
] | null | null | null | results/metaproteomics/README.md | mkorlevic/Korlevic_SelectiveRemoval_FrontMicrobiol_2021 | b9c06ce5bf45821eae652846c89e73e16d55274d | [
"MIT"
] | 1 | 2020-11-16T13:54:32.000Z | 2020-11-16T14:13:51.000Z | results/metaproteomics/README.md | mkorlevic/Korlevic_SelectiveRemoval_FrontMicrobiol_2021 | b9c06ce5bf45821eae652846c89e73e16d55274d | [
"MIT"
] | 2 | 2020-11-17T08:15:07.000Z | 2021-01-18T08:45:20.000Z | This directory contains data produced in the metaproteomic analysis.
| 34 | 67 | 0.852941 | eng_Latn | 0.999571 |
3b8c6db2196049569dd300fe8f03ccce9ab44458 | 1,256 | md | Markdown | posts/kr/intro.md | virgosoy/axios-docs | 4c458519d5cc6f4b8f8684187346f2a970194adb | [
"MIT"
] | 74 | 2020-12-03T13:35:42.000Z | 2022-03-28T13:44:53.000Z | posts/kr/intro.md | virgosoy/axios-docs | 4c458519d5cc6f4b8f8684187346f2a970194adb | [
"MIT"
] | 36 | 2020-06-29T13:56:11.000Z | 2022-03-16T10:58:13.000Z | posts/kr/intro.md | virgosoy/axios-docs | 4c458519d5cc6f4b8f8684187346f2a970194adb | [
"MIT"
] | 63 | 2020-11-20T15:46:59.000Z | 2022-03-23T13:05:12.000Z | ---
title: 'Getting Started'
description: 'Promise based HTTP client library for the browser and node.js'
next_title: 'Basic Example'
next_link: '/kr/docs/example'
---
# What is Axios?
Axios is a *[promise-based](https://javascript.info/promise-basics)* HTTP client for node.js and the browser. It is *[isomorphic](https://www.lullabot.com/articles/what-is-an-isomorphic-application)* (it can run in the browser and node.js with the same codebase). On the server side it uses the native node.js `http` module, while on the client (browser) it uses XMLHttpRequests.
# Features
- Make [XMLHttpRequests](https://developer.mozilla.org/ko/docs/Web/API/XMLHttpRequest) from the browser
- Make [http](http://nodejs.org/api/http.html) requests from node.js
- Supports the [Promise](https://developer.mozilla.org/ko/docs/Web/JavaScript/Reference/Global_Objects/Promise) API
- Intercept request and response
- Transform request and response data
- Cancel requests
- Automatic transforms for JSON data
- Client side support for protecting against [XSRF](https://ko.wikipedia.org/wiki/%EC%82%AC%EC%9D%B4%ED%8A%B8_%EA%B0%84_%EC%9A%94%EC%B2%AD_%EC%9C%84%EC%A1%B0)
# Installation
Using npm:
```bash
$ npm install axios
```
Using bower:
```bash
$ bower install axios
```
Using yarn:
```bash
$ yarn add axios
```
Using jsDelivr CDN:
```html
<script src="https://cdn.jsdelivr.net/npm/axios/dist/axios.min.js"></script>
```
Using unpkg CDN:
```html
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
``` | 24.153846 | 299 | 0.703822 | kor_Hang | 0.997437 |
3b8cabf5b0eb9f544e5d06cd92f4e15201c2cbc0 | 598 | md | Markdown | src/main/resources/docs/description/no-proto.md | tobiasweibel/codacy-eslint | 7e61c731de618d26efa89eba0ffe495b56e8fe0d | [
"Apache-2.0"
] | null | null | null | src/main/resources/docs/description/no-proto.md | tobiasweibel/codacy-eslint | 7e61c731de618d26efa89eba0ffe495b56e8fe0d | [
"Apache-2.0"
] | 9 | 2019-12-27T17:33:01.000Z | 2022-03-31T01:04:15.000Z | src/main/resources/docs/description/no-proto.md | tobiasweibel/codacy-eslint | 7e61c731de618d26efa89eba0ffe495b56e8fe0d | [
"Apache-2.0"
] | 4 | 2020-01-29T14:26:30.000Z | 2021-07-27T13:22:02.000Z | Although the `__proto__` property has been deprecated as of ECMAScript 3.1 and shouldn't be used in code,
keep in mind that if you need to support legacy browsers, you might want to turn this rule off, since support for `getPrototypeOf` is not yet universal. When an object is created, `__proto__` is set to the original prototype property of the object’s constructor function. `getPrototypeOf` is the preferred method of getting "the prototype".
```
//Bad:
var a = obj.__proto__;
var a = obj["__proto__"];
//Good:
var a = Object.getPrototypeOf(obj);
```
[Source](http://eslint.org/docs/rules/no-proto)
| 46 | 333 | 0.764214 | eng_Latn | 0.993762 |
3b8d382fd7b922de60455a49ee3ee79899baf01e | 895 | md | Markdown | README.md | cedd82/FightSearchV3 | 8a89ddf61064d9c69377e0c0ce1e81621c56f6ce | [
"MIT"
] | null | null | null | README.md | cedd82/FightSearchV3 | 8a89ddf61064d9c69377e0c0ce1e81621c56f6ce | [
"MIT"
] | null | null | null | README.md | cedd82/FightSearchV3 | 8a89ddf61064d9c69377e0c0ce1e81621c56f6ce | [
"MIT"
] | null | null | null | FightSearch is a .NET Core and Angular 7 web application. It serves as a video aggregator for MMA bouts from the UFC, Bellator, Pride, Strikeforce and WEC. It can be visited at https://www.mmavideosearch.com
Fights can be filtered on a variety of data not available on https://www.ufc.tv and http://www.bellator.com/videos. This is achieved by obtaining data from other sources and matching it to the videos. This matching is not done as part of this project, but in another one.
The EF query only hits one table, as that table is constructed when the data is updated (which is only done every couple of months), so it performs faster than joining over 3 tables. Not a huge performance improvement, but it was a pragmatic solution for what I needed.
I strive to keep this up to date with the latest technologies. Originally it was written in AngularJS and ASP.NET Web API. It is now Angular 7 and .NET Core 2.1.
| 111.875 | 264 | 0.788827 | eng_Latn | 0.999697 |
3b8e104d2d55cd48833d9d818896ff432555919d | 162 | md | Markdown | content/past_project/pipeline.md | aeksco/hugo-scratchpad | 0af0d68a5f9062bfc640a33c032566fcd6e09fd6 | [
"MIT"
] | null | null | null | content/past_project/pipeline.md | aeksco/hugo-scratchpad | 0af0d68a5f9062bfc640a33c032566fcd6e09fd6 | [
"MIT"
] | null | null | null | content/past_project/pipeline.md | aeksco/hugo-scratchpad | 0af0d68a5f9062bfc640a33c032566fcd6e09fd6 | [
"MIT"
] | null | null | null | ---
title: "Pipeline"
github: "pipeline"
website: "https://poly.rpi.edu/pipeline/"
---
Content organization, tracking, and distribution for media organizations.
| 20.25 | 73 | 0.740741 | eng_Latn | 0.676124 |
3b8f29ad42cd4bc3a4238c9f14d803d2be59955a | 451 | md | Markdown | cmd/stitching_sql/README.md | GanLuo96214/Stitching-SQL | 1ceedbb9f8bd7cdef174cd755eff73ea54e37dda | [
"MIT"
] | 1 | 2020-08-06T01:38:20.000Z | 2020-08-06T01:38:20.000Z | cmd/stitching_sql/README.md | ganluo960214/StitchingSQLGo | 1ceedbb9f8bd7cdef174cd755eff73ea54e37dda | [
"MIT"
] | null | null | null | cmd/stitching_sql/README.md | ganluo960214/StitchingSQLGo | 1ceedbb9f8bd7cdef174cd755eff73ea54e37dda | [
"MIT"
] | null | null | null | # Stitching Sql
generate mapper and list from type and type cost
## flags
```
usage: go:generate stitching_sql -type=example -file-name=u_can_set_file_name_or_by_default__-type_stitching_sql.go
-type data type
data type,must be set
-file-name newly generated file name, default as "-type_stitching_sql.go"
file name of the generated file
```
## usage
```shell script
go get -u github.com/ganLuo960214/StitchingSQLGo/cmd/stitching_sql
```
| 22.55 | 115 | 0.767184 | eng_Latn | 0.903453 |
3b9121450ded57e49f6de8bac182dba5f8ca679e | 2,827 | md | Markdown | _posts/2021-04-26-linux-setting-mariaDB.md | kha0213/kha0213.github.io | 56c3c410e4ca4d165349576b7fb3f80e18923d2a | [
"MIT"
] | null | null | null | _posts/2021-04-26-linux-setting-mariaDB.md | kha0213/kha0213.github.io | 56c3c410e4ca4d165349576b7fb3f80e18923d2a | [
"MIT"
] | null | null | null | _posts/2021-04-26-linux-setting-mariaDB.md | kha0213/kha0213.github.io | 56c3c410e4ca4d165349576b7fb3f80e18923d2a | [
"MIT"
] | null | null | null | ---
title: "Linux Create MariaDB Server"
categories:
- linux
tags:
- linux
- mariaDB
- Server
last_modified_at: 2021-04-26T00:40:00-00:00
---
# Linux Quick start
## 1. Install WSL
[WSL installation: https://docs.microsoft.com/ko-kr/windows/wsl/install-win10](https://docs.microsoft.com/ko-kr/windows/wsl/install-win10)
There is a tool called WSL that makes it convenient to work with Linux on Windows. Since it is officially supported by Microsoft, you can now comfortably use Linux even on Windows.
Let's install Ubuntu and then set up a MariaDB server.
Just follow the installation guide.
## 2. Install Ubuntu
## 3. Install mariaDB
1. sudo apt update && sudo apt-get -y upgrade
apt-get is the package-management command-line tool used on Ubuntu (and other Debian-based) Linux distributions.
This command checks the repositories for updated packages and upgrades the installed packages.
[Reference: https://blog.outsider.ne.kr/346](https://blog.outsider.ne.kr/346)
2. sudo apt-get install -y mariadb-server
MariaDB is installed automatically as part of this.
Verify the connection with mysql -u root -p.
* Starting with MariaDB 10.4, you have to use sudo mysql instead.
[Reference: https://www.nemonein.xyz/2019/07/2254/](https://www.nemonein.xyz/2019/07/2254/)
```shell
sudo mysql
```
I did not have permission here, so the following error occurred:
* ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock'
[Reference: https://jimnong.tistory.com/807](https://jimnong.tistory.com/807)
In Windows PowerShell, I ran
```shell
wsl --list --verbose
```
to check the status, and then ran
```shell
wsl -t ubuntu
```
to terminate it. But authentication still failed, so I used the following method.
[Reference: https://thinkpro.tistory.com/16](https://thinkpro.tistory.com/16)
I added the permission via $ sudo visudo.
But it still did not work.
```shell
mysql                        # this command did not work either
sudo service mysql restart   # so restart the server
sudo mysql -u root -p mysql  # this got me into mysql
sudo mysql                   # final run
```
😀 Result
```markdown
Welcome to the MariaDB monitor. Commands end with ; or \g.
Your MariaDB connection id is 39
Server version: 10.3.25-MariaDB-0ubuntu0.20.04.1 Ubuntu 20.04
Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
```
## 4. mariaDB setting
[Reference: https://jdm.kr/blog/132](https://jdm.kr/blog/132)
```sql
create database db_name;
create user 'username'@'ipaddress' identified by 'password';
grant all privileges on db_name.* to 'username'@'ipaddress';
flush privileges;
-- verify
select host, user from mysql.user;
```
😀 Result
```markdown
+--------------+---------+
| host | user |
+--------------+---------+
| (ip address) | username|
| localhost | root |
+--------------+---------+
```
* Allowing connections from outside
Go back to root privileges, open the 50-server.cnf file in the /etc/mysql/mariadb.conf.d/ folder,
and change the bind-address entry to 0.0.0.0 so the server can be reached from anywhere (the default is 127.0.0.1).
🐤 Tip: if you lose the root password on Ubuntu
1. Open cmd on Windows
2. Change the configuration so Ubuntu logs in as root
* ubuntu.exe config --default-user root
3. Launch Ubuntu
4. Change the password with the passwd command
5. Switch the default user back to your own account in Windows cmd
| 26.420561 | 131 | 0.657588 | kor_Hang | 0.999425 |
3b95376bd88d7c3617ad59ba82b7637d99f0322a | 279 | md | Markdown | _posts/2015-03-10-startup_talk.md | luizcarvalho/luizcarvalho.github.io | 55d451c4d14ddec3d2721d604651ac832b710989 | [
"Apache-2.0"
] | 3 | 2015-08-08T03:52:29.000Z | 2017-04-30T14:18:06.000Z | _posts/2015-03-10-startup_talk.md | luizcarvalho/luizcarvalho.github.io | 55d451c4d14ddec3d2721d604651ac832b710989 | [
"Apache-2.0"
] | 11 | 2015-07-10T18:34:28.000Z | 2022-02-26T03:48:50.000Z | _posts/2015-03-10-startup_talk.md | luizcarvalho/luizcarvalho.github.io | 55d451c4d14ddec3d2721d604651ac832b710989 | [
"Apache-2.0"
] | null | null | null | ---
layout: default
thumb_url: "http://res.cloudinary.com/drlko5ghb/image/upload/v1473634263/r5kcn4bypmnmdmv5p8dw.webp"
modal_id: 10
date: 2015-03-10
title: Startup Talk
subtitle: Weekly event about startups
prioridade: 8
medium_post_url: https://medium.com/p/131b96795078
---
| 25.363636 | 99 | 0.795699 | kor_Hang | 0.113588 |
3b957689e8517ef6b3fe4b01735a8fe9991903f3 | 15 | md | Markdown | README.md | songhailong/HLEmoji | dd57fcdaef090ba08e66bde4814a71302cad5f9a | [
"Apache-2.0"
] | 1 | 2018-10-16T06:41:10.000Z | 2018-10-16T06:41:10.000Z | README.md | songhailong/HLEmoji | dd57fcdaef090ba08e66bde4814a71302cad5f9a | [
"Apache-2.0"
] | null | null | null | README.md | songhailong/HLEmoji | dd57fcdaef090ba08e66bde4814a71302cad5f9a | [
"Apache-2.0"
] | null | null | null | # HLEmoji
Emoji keyboard
| 5 | 9 | 0.733333 | lit_Latn | 0.974447 |
3b95af4655672c0d44ebcd59e4ffd773b2c660d5 | 502 | md | Markdown | pages/api/heft.heftconfiguration.heftpackagejson.md | elliot-nelson/rushstack.io-website | 073c8c2621055dc26f8aa0baeef294a3ca832030 | [
"CC-BY-4.0",
"MIT"
] | 10 | 2019-11-08T06:57:43.000Z | 2022-02-04T23:30:01.000Z | pages/api/heft.heftconfiguration.heftpackagejson.md | elliot-nelson/rushstack.io-website | 073c8c2621055dc26f8aa0baeef294a3ca832030 | [
"CC-BY-4.0",
"MIT"
] | 11 | 2019-09-05T05:20:57.000Z | 2022-02-26T05:34:54.000Z | pages/api/heft.heftconfiguration.heftpackagejson.md | elliot-nelson/rushstack.io-website | 073c8c2621055dc26f8aa0baeef294a3ca832030 | [
"CC-BY-4.0",
"MIT"
] | 23 | 2019-11-08T06:57:46.000Z | 2022-03-25T15:59:47.000Z | ---
layout: page
navigation_source: api_nav
improve_this_button: false
---
<!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@rushstack/heft](./heft.md) > [HeftConfiguration](./heft.heftconfiguration.md) > [heftPackageJson](./heft.heftconfiguration.heftpackagejson.md)
## HeftConfiguration.heftPackageJson property
The Heft tool's package.json
<b>Signature:</b>
```typescript
get heftPackageJson(): IPackageJson;
```
| 26.421053 | 175 | 0.715139 | eng_Latn | 0.420043 |
3b96638248d2382558e2328c06b2ca889dc05d1d | 189 | md | Markdown | joystik.md | ismamera/ARDUINO | 5e78d445715ee54da61fb54309ec6c52803259c4 | [
"MIT"
] | null | null | null | joystik.md | ismamera/ARDUINO | 5e78d445715ee54da61fb54309ec6c52803259c4 | [
"MIT"
] | null | null | null | joystik.md | ismamera/ARDUINO | 5e78d445715ee54da61fb54309ec6c52803259c4 | [
"MIT"
] | null | null | null | Today in class we are working on the joystick, for which I made the male/female cable connections.
| 2.147727 | 102 | 0.444444 | spa_Latn | 0.999271 |
3b97121bcbf3fe9789dcf44f60a7ffb0216b4614 | 4,232 | md | Markdown | README.md | LIDS-UNICAMP/ODISF | df9710affe22d9f5f1e0632ce1368083fcfbfea1 | [
"MIT"
] | 3 | 2022-01-23T16:50:21.000Z | 2022-02-23T15:30:57.000Z | README.md | LIDS-UNICAMP/ODISF | df9710affe22d9f5f1e0632ce1368083fcfbfea1 | [
"MIT"
] | null | null | null | README.md | LIDS-UNICAMP/ODISF | df9710affe22d9f5f1e0632ce1368083fcfbfea1 | [
"MIT"
] | null | null | null | ## Object-based Dynamic and Iterative Spanning Forest (ODISF)
This is the implementation of the superpixel segmentation method _Object-based Dynamic and Iterative Spanning Forest_ (ODISF) as proposed in
- **F.Belém, B.Perret, J.Cousty, S.Guimarães, A.Falcão.** [_Towards a Simple and Efficient Object-based Superpixel Delineation Framework_](https://ieeexplore.ieee.org/document/9643123). In 34th International Conference on Graphics, Patterns and Images (SIBGRAPI), pp. 346-353. 2021.
This software includes a program for running the ODISF method, and another one for assisting the visualization of the segmentation by overlaying the superpixel borders. Please cite the aforementioned paper if you use any of this software in your own project.
### Hardware, Setup and Requirements
The project was developed in **C** under a **Linux-based** operating system; therefore, it is **NOT GUARANTEED** to work properly on other systems (_e.g._, Windows and macOS). Moreover, the same applies to **non-GCC** compilers, such as Clang and MinGW.
All code within this project was developed, compiled and tested using the following programs:
- **[GNU GCC](https://gcc.gnu.org/)**: version 7.5.0
- **[GNU Make](https://www.gnu.org/software/make/)**: version 4.1
This code was implemented and evaluated on a computer with the following specifications:
- **Model:** Acer X555LB
- **Operating System:** Linux Mint v20.2 x86_64 kernel version 5.4.0-86-generic
- **Byte Order:** Little-Endian
- **CPU:** 4x Dual-core Intel(R) Core(TM) i5-5200 @ 2.20 GHz
- **Memory:** 8GB RAM ; 480 SSD
The library has built-in support for handling **PNM** images. To enable external library support, please refer to the [README](externals/README.md) file within the **externals** folder.
### Compiling
If your computer meets the aforementioned requirements, you may run the commands below for compiling the library and all the demonstration programs.
```bash
make lib
make demo
```
Or simply run one of the following commands to compile both at once.
```bash
make
make all
```
For removing the files generated from compilation, one may run the following rule.
```bash
make clean
```
### Running
After compiling, one may run ODISF for segmenting an image through the following command
```bash
./bin/RunODISF --img path/to/image.ppm --objsm path/to/objsm.pgm --out path/to/segm.pgm
```
Briefly, `--img`, `--objsm`, and `--out` indicate the paths to the image to be segmented and its object saliency map, and to the resulting segmentation, respectively. For other arguments, one may run
```bash
./bin/RunODISF --help
```
for more information.
Given such segmentation, it is possible to draw the superpixel borders over the original image for a better visualization by the following command
```bash
./bin/RunOvlayBorders --img path/to/image.ppm --labels path/to/segm.pgm --out path/to/overlay.ppm
```
In this case, `--img`, `--labels`, and `--out` indicate the paths to the original image and segmentation, and to the resulting overlay image, respectively. Likewise, for other arguments, one may run
```bash
./bin/RunOvlayBorders --help
```
for more information.
### License
All code within this project is under the **MIT License**. See the [LICENSE](LICENSE) file for more details.
### Acknowledgements
This work was financially supported by the following Brazilian research funding agencies:
- Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
- Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
- Fundação de Amparo à Pesquisa do Estado de Minas Gerais (FAPEMIG)
- Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
### Contact
If you have any questions or face unexpected behavior (_e.g._ bugs), please feel free to contact the authors through the following email addresses:
- **Felipe C. Belém**: [felipe.belem@ic.unicamp.br](mailto:felipe.belem@ic.unicamp.br)
- **Benjamin Perret**: [benjamin.perret@esiee.fr](mailto:benjamin.perret@esiee.fr)
- **Jean Cousty**: [jean.cousty@esiee.fr](mailto:jean.cousty@esiee.fr)
- **Silvio Jamil F. Guimarães**: [sjamil@pucminas.br](mailto:sjamil@pucminas.br)
- **Alexandre X. Falcão**: [afalcao@ic.unicamp.br](mailto:afalcao@ic.unicamp.br)
| 49.788235 | 281 | 0.757089 | eng_Latn | 0.979274 |
3b973d82547e0311060fd975f6de44779c4f77ad | 74 | md | Markdown | README.md | JaimeBarranquero1989/cal_I | 97d6c35908d03528be6f762a06453d4c0b65597d | [
"MIT"
] | null | null | null | README.md | JaimeBarranquero1989/cal_I | 97d6c35908d03528be6f762a06453d4c0b65597d | [
"MIT"
] | null | null | null | README.md | JaimeBarranquero1989/cal_I | 97d6c35908d03528be6f762a06453d4c0b65597d | [
"MIT"
] | null | null | null | # cal_I
This is a test of using Git, by Jaime Barranquero
| 24.666667 | 65 | 0.797297 | spa_Latn | 0.967959 |
3b97a5eb2e40cfacd92ce3e2f2cf0fe990693b93 | 1,659 | md | Markdown | _posts/2018-08-31-web worker学习笔记.md | wwjwentworth/wwjwentworth.github.io | 967e7be72499be1d2ba86259a9566731dbf5f757 | [
"MIT"
] | 2 | 2018-05-10T13:50:59.000Z | 2018-05-10T13:51:03.000Z | _posts/2018-08-31-web worker学习笔记.md | wwjwentworth/wwjwentworth.github.io | 967e7be72499be1d2ba86259a9566731dbf5f757 | [
"MIT"
] | null | null | null | _posts/2018-08-31-web worker学习笔记.md | wwjwentworth/wwjwentworth.github.io | 967e7be72499be1d2ba86259a9566731dbf5f757 | [
"MIT"
] | 1 | 2018-06-22T14:24:18.000Z | 2018-06-22T14:24:18.000Z | ---
layout: post
date: 2018-08-31
author: WWJ
header-img: img/post-bg-universe.jpg
catalog: true
tags: Web Worker
---
# webWorker
> Lets a piece of JavaScript code run on a thread separate from the main thread.
**How to create one**
`index.js`
```javascript
const worker = new Worker("./task.js");
```
![enter image description here](https://image.ibb.co/m4qjhp/error.png)
In Chrome this will throw an error, because Chrome does not support loading Web Workers directly from local files. A simple workaround is to serve the files locally: just install `http-server` globally, then start a server in the directory that contains `index.js`.
Global install: npm install http-server -g
Start the server: http-server
The `worker` object is the communication bridge between the main thread and the worker thread; the two threads can communicate through
```javascript
onmessage: listen for messages
postMessage: send messages
```
Example:
```
// main thread: index.js
var worker = new Worker("worker.js");
worker.onmessage = function(event){
// the main thread receives a message from the worker thread
};
// the main thread sends a message to the worker thread
worker.postMessage({
type: "start",
value: 12345
});

// worker thread: task.js
onmessage = function(event){
// message received
};
postMessage({
type: "debug",
message: "Starting processing..."
});
```
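One detail worth noting: `postMessage` does not share the object it sends. The payload is copied with the structured clone algorithm, so the receiving thread gets an independent copy. A quick sketch using `structuredClone` (the same algorithm; available in modern browsers and Node 17+):

```javascript
var original = { type: "start", value: 12345 };
// structuredClone copies the object the same way postMessage does
var copy = structuredClone(original);
copy.value = 0; // mutate only the copy
console.log(original.value); // 12345 -- the original is untouched
```

This is why a worker can never accidentally mutate objects owned by the main thread.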
**How to terminate**
If we no longer want the worker to keep running, we can call `worker.terminate()` in the main thread, or `self.close()` inside the worker thread itself.
**Error handling**
```javascript
worker.onerror = function (error) {
console.log(error.filename, error.lineno, error.message);
}
// error.filename: name of the script that threw the error
// error.lineno: line number where the error occurred
// error.message: the error message
```
**shared worker**
> A web worker runs only within the current page and terminates as soon as the page is closed, whereas a shared worker can be used by multiple pages at the same time and will not stop running just because one of those pages is closed.
```javascript
const worker = new SharedWorker('./task.js')
```
A shared worker also uses `onmessage` to listen for events and `postMessage` to send data, but it does so through the worker's `port` property.
```javascript
worker.port.onmessage = function(){
// handle data from the shared worker
}
worker.port.postMessage({
// data to send
})
```
### The difference between web workers and async code
A `web worker` is true multithreading: the child thread started by the `worker` and the main thread neither interfere with nor block each other. Async code, by contrast, still runs on the main thread; async tasks are first added to the event queue, and only when the main thread has no work left are the queued tasks executed one by one. If an async task blocks, the main thread is blocked as well.
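The queueing behavior described above can be seen even without a worker: in this sketch, a zero-delay timer cannot fire until the synchronous busy loop on the main thread has finished.

```javascript
var start = Date.now();
setTimeout(function () {
  // runs only after all synchronous code below has finished
  console.log("timer fired after ~" + (Date.now() - start) + " ms");
}, 0);
// busy-wait for 200 ms: this blocks the main thread, so the
// already-expired zero-delay timer above still has to wait
while (Date.now() - start < 200) {}
console.log("busy loop done");
```

"busy loop done" is always printed first, which is exactly why heavy work belongs in a worker rather than in async callbacks.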
| 20.231707 | 136 | 0.731163 | yue_Hant | 0.193337 |
3b9917feaf87073dae4ea7829a98d0fa6c512233 | 781 | md | Markdown | README.md | shr4ppy/cassetteWinder | d19fbd2650a19a147d5a0847805790c9a973e9ec | [
"Unlicense"
] | null | null | null | README.md | shr4ppy/cassetteWinder | d19fbd2650a19a147d5a0847805790c9a973e9ec | [
"Unlicense"
] | null | null | null | README.md | shr4ppy/cassetteWinder | d19fbd2650a19a147d5a0847805790c9a973e9ec | [
"Unlicense"
] | null | null | null | # cassetteWinder
## About
This is a utility for my hobby of recording cassettes. Counting the minutes and seconds of songs, with a reasonable margin of error so the recording does not overrun the length of a cassette, is strenuous and irritating. This script does the counting quickly and easily, and creates an easy-to-read tracklist so you can fast-forward to whatever song you want.
## Usage
Simply run main.py. On Linux, this program will need superuser permissions in order to create files. The loop will walk you through the whole process. A Spotify account is needed: log in at https://developer.spotify.com and create a new application, then write your client ID, client secret and callback URL (which ideally is http://localhost:8080/callback) in credentials.py.
| 78.1 | 380 | 0.791293 | eng_Latn | 0.999412 |
3b9991a8c19e346a79c2d4744c43fbb896c87a54 | 1,397 | md | Markdown | README.md | lexbel/git_issue_tracker | 4c7c27ae4a5d3b6c1b31e422ab8a2d3abce56cb0 | [
"MIT"
] | null | null | null | README.md | lexbel/git_issue_tracker | 4c7c27ae4a5d3b6c1b31e422ab8a2d3abce56cb0 | [
"MIT"
] | 1 | 2020-03-26T21:54:15.000Z | 2020-03-26T21:54:15.000Z | README.md | lexbel/git_issue_tracker | 4c7c27ae4a5d3b6c1b31e422ab8a2d3abce56cb0 | [
"MIT"
] | null | null | null | # Git issue tracker
This app tracks changes in Git by receiving a webhook from Bitbucket and synchronizes the issues it finds
with Jira in terms of fix version, so it gives a 100% guarantee that the fix version is set correctly.
Bitbucket or Jira could be swapped for any other Git server or bug-tracker system (that integration would have to be written by you).
## How does it work
1. A webhook (on push) should be set up on the Git server.
2. Git itself should be installed on the machine where this application runs.
3. Extend `IssueHandler` and implement your own workflow with your bug-tracker system.
For example, set **fixVersion** directly on each task, or only on stories without subtasks, or just add a
comment saying where it was merged. For myself, I used it together with [automationforjira] for
further no-code processing of the received data.
4. Extend `WebHookDataParser` if you use something other than Bitbucket as your Git server.
## Variables to override if needed
- `TRACKED_BRANCH_REGEXP` : which branches are tracked; by default it is set to
`(release/.*|hotfix/.*|support/.*|develop|dev)`. This follows the git-flow branch model.
- `MERGE_PATTERN_SEARCH_TO_SKIP` : merge pattern that should not be tracked; by default it is set to
`Merge.*((release\/|support\/|hotfix\/)|(tag)).*(develop|dev).*`.
- `WHITE_LISTED_REPOS` : repository names that are available for further processing, by default
[automationforjira]: https://automationforjira.com/ | 48.172414 | 107 | 0.767359 | eng_Latn | 0.998351 |
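As a quick illustration (JavaScript is used here only for the demonstration; it is not part of this project), branch names can be checked against the default `TRACKED_BRANCH_REGEXP` like this:

```javascript
// Default pattern quoted from this README (git-flow branch model),
// anchored so the whole branch name must match
var tracked = /^(release\/.*|hotfix\/.*|support\/.*|develop|dev)$/;

console.log(tracked.test("release/2.4.0")); // true  -> pushes are tracked
console.log(tracked.test("develop"));       // true
console.log(tracked.test("feature/login")); // false -> ignored
```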
3b99977fe529ac87511f56fefbf2a37cb5867ca7 | 962 | md | Markdown | content/post/2013-03-01-a-bit-more-on-the-fire.md | mnsh16/blog | 3d9936f0a406229679be405402c50d267fd166bd | [
"CC-BY-4.0"
] | 6 | 2017-05-01T04:58:04.000Z | 2021-04-06T18:58:30.000Z | content/post/2013-03-01-a-bit-more-on-the-fire.md | mnsh16/blog | 3d9936f0a406229679be405402c50d267fd166bd | [
"CC-BY-4.0"
] | 6 | 2017-04-30T04:49:24.000Z | 2021-06-13T20:17:58.000Z | content/post/2013-03-01-a-bit-more-on-the-fire.md | mnsh16/blog | 3d9936f0a406229679be405402c50d267fd166bd | [
"CC-BY-4.0"
] | 13 | 2017-04-30T05:27:33.000Z | 2021-02-13T09:56:51.000Z | ---
title: A bit more on the fire
author: Karl Broman
date: '2013-03-01'
categories:
- News
tags:
- news
slug: a-bit-more-on-the-fire
---
The [Daily Cardinal](http://host.madison.com/daily-cardinal/) reports that [there were no overhead sprinklers](http://tinyurl.com/cuyhdtl) in the area of the fire yesterday. Fire fighters thought that sprinklers had gone off, but it was really a broken water pipe.
And there are no sprinklers in my office, either. I thought they were required, but I guess only in new construction.
There's a [short TV report](http://www.nbc15.com/home/headlines/Fire-Reported-at-UW-Medical-Science-Building-193836731.html) (after a commercial) at the [channel 15 site](http://www.nbc15.com).
A graduate student interviewed said, "I have multiple copies of my data, but they're all in that building."
I hope we all learn from this: Off-site backups (at least with something like [DropBox](http://www.dropbox.com)) are important.
| 45.809524 | 265 | 0.751559 | eng_Latn | 0.988468 |
3b99a0eccd69acbc9387dfed1535cd4008e5bbd2 | 18,015 | md | Markdown | skype/skype-ps/skype/Remove-CsAnalogDevice.md | v-anpasi/office-powershell-docs | 40f1950d2157f58117f34fe779332a58df0fe6b1 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-05-31T15:24:18.000Z | 2021-05-31T15:24:18.000Z | skype/skype-ps/skype/Remove-CsAnalogDevice.md | v-anpasi/office-powershell-docs | 40f1950d2157f58117f34fe779332a58df0fe6b1 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | skype/skype-ps/skype/Remove-CsAnalogDevice.md | v-anpasi/office-powershell-docs | 40f1950d2157f58117f34fe779332a58df0fe6b1 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
external help file:
applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015
schema: 2.0.0
---
# Remove-CsAnalogDevice
## SYNOPSIS
**Below Content Applies To:** Lync Server 2010
Removes an existing device from the collection of analog devices that can be managed by using Microsoft Lync Server 2010.
An analog device is a telephone or other device that is connected to the public switched telephone network (PSTN).
**Below Content Applies To:** Lync Server 2013
Removes an existing device from the collection of analog devices that can be managed by using Lync Server.
An analog device is a telephone or other device that is connected to the public switched telephone network (PSTN).
This cmdlet was introduced in Lync Server 2010.
**Below Content Applies To:** Skype for Business Server 2015
Removes an existing device from the collection of analog devices that can be managed by using Skype for Business Server 2015.
An analog device is a telephone or other device that is connected to the public switched telephone network (PSTN).
This cmdlet was introduced in Lync Server 2010.
## SYNTAX
```
Remove-CsAnalogDevice [-Identity] <UserIdParameter> [-WhatIf] [-Confirm] [<CommonParameters>]
```
## DESCRIPTION
**Below Content Applies To:** Lync Server 2010
Analog devices include telephones, fax machines, modems, and teletype/telecommunication devices for the deaf (TTY/TDD) devices connected to the public switched telephone network (PSTN).
Unlike devices that take advantage of Enterprise Voice (the Voice over Internet Protocol (VoIP) solution offered by Microsoft), analog devices do not transmit information by using digital packets.
Instead, information is transmitted by using a continuous signal.
This signal is commonly referred to as an analog signal; hence the term "analog devices."
In order to enable administrators to manage analog devices for organizations, Lync Server 2010 lets you associate analog devices with Active Directory contact objects.
After a device has been associated with a contact object, you can then manage the analog device by assigning policies and dial plans to the contact.
Over time, you might need to delete a contact object associated with an analog device.
For example, if you phase out all of your fax machines, you will no longer need to have analog devices (and contact objects) associated with those machines.
The Remove-CsAnalogDevice cmdlet provides a way for you to delete analog devices.
When you run this cmdlet, the device will be deleted from the list of analog devices returned by Get-CsAnalogDevice.
Additionally, the contact object associated with that device will be deleted from Active Directory Domain Services (AD DS).
Who can run this cmdlet: By default, members of the following groups are authorized to run the Remove-CsAnalogDevice cmdlet locally: RTCUniversalUserAdmins.
Permissions to run this cmdlet for specific sites or specific Active Directory organizational units (OUs) can be assigned by using the Grant-CsOUPermission cmdlet.
To return a list of all the role-based access control (RBAC) roles this cmdlet has been assigned to (including any custom RBAC roles you have created yourself), run the following command from the Windows PowerShell prompt:
Get-CsAdminRole | Where-Object {$_.Cmdlets -match "Remove-CsAnalogDevice"}
**Below Content Applies To:** Lync Server 2013
Analog devices include telephones, fax machines, modems, and teletype/telecommunication device for the deaf (TTY/TDD) devices connected to the public switched telephone network (PSTN).
Unlike devices that take advantage of Enterprise Voice (the Voice over Internet Protocol (VoIP) solution offered by Microsoft), analog devices do not transmit information by using digital packets.
Instead, information is transmitted by using a continuous signal.
This signal is commonly referred to as an analog signal; hence the term "analog devices."
In order to enable administrators to manage analog devices for organizations, Lync Server lets you associate analog devices with Active Directory contact objects.
After a device has been associated with a contact object, you can then manage the analog device by assigning policies and dial plans to the contact.
Over time, you might need to delete a contact object associated with an analog device.
For example, if you phase out all of your fax machines, you will no longer need to have analog devices (and contact objects) associated with those machines.
The Remove-CsAnalogDevice cmdlet provides a way for you to delete analog devices.
When you run this cmdlet, the device will be deleted from the list of analog devices returned by Get-CsAnalogDevice.
Additionally, the contact object associated with that device will be deleted from Active Directory Domain Services (AD DS).
Who can run this cmdlet: By default, members of the following groups are authorized to run the Remove-CsAnalogDevice cmdlet locally: RTCUniversalUserAdmins.
Permissions to run this cmdlet for specific sites or specific Active Directory organizational units (OUs) can be assigned by using the Grant-CsOUPermission cmdlet.
To return a list of all the role-based access control (RBAC) roles this cmdlet has been assigned to (including any custom RBAC roles you have created yourself), run the following command from the Windows PowerShell prompt:
Get-CsAdminRole | Where-Object {$_.Cmdlets -match "Remove-CsAnalogDevice"}
**Below Content Applies To:** Skype for Business Server 2015
Analog devices include telephones, fax machines, modems, and teletype/telecommunication device for the deaf (TTY/TDD) devices connected to the public switched telephone network (PSTN).
Unlike devices that take advantage of Enterprise Voice (the Voice over Internet Protocol (VoIP) solution offered by Microsoft), analog devices do not transmit information by using digital packets.
Instead, information is transmitted by using a continuous signal.
This signal is commonly referred to as an analog signal; hence the term "analog devices."
In order to enable administrators to manage analog devices for organizations, Skype for Business Server 2015 lets you associate analog devices with Active Directory contact objects.
After a device has been associated with a contact object, you can then manage the analog device by assigning policies and dial plans to the contact.
Over time, you might need to delete a contact object associated with an analog device.
For example, if you phase out all of your fax machines, you will no longer need to have analog devices (and contact objects) associated with those machines.
The Remove-CsAnalogDevice cmdlet provides a way for you to delete analog devices.
When you run this cmdlet, the device will be deleted from the list of analog devices returned by the Get-CsAnalogDevice cmdlet.
Additionally, the contact object associated with that device will be deleted from Active Directory Domain Services.
## EXAMPLES
### -------------------------- Example 1 ------------------------ (Lync Server 2010)
```
Remove-CsAnalogDevice -Identity "CN={e5e7daba-394e-46ec-95a1-1f2a9947aad2},CN=Users,DC=litwareinc,DC=com"
```
Example 1 deletes the analog device that has the Identity CN={e5e7daba-394e-46ec-95a1-1f2a9947aad2},CN=Users,DC=litwareinc,DC=com.
### -------------------------- EXAMPLE 1 -------------------------- (Lync Server 2013)
```
Remove-CsAnalogDevice -Identity "CN={e5e7daba-394e-46ec-95a1-1f2a9947aad2},CN=Users,DC=litwareinc,DC=com"
```
Example 1 deletes the analog device that has the Identity CN={e5e7daba-394e-46ec-95a1-1f2a9947aad2},CN=Users,DC=litwareinc,DC=com.
### -------------------------- EXAMPLE 1 -------------------------- (Skype for Business Server 2015)
```
Remove-CsAnalogDevice -Identity "CN={e5e7daba-394e-46ec-95a1-1f2a9947aad2},CN=Users,DC=litwareinc,DC=com"
```
Example 1 deletes the analog device that has the Identity CN={e5e7daba-394e-46ec-95a1-1f2a9947aad2},CN=Users,DC=litwareinc,DC=com.
### -------------------------- Example 2 ------------------------ (Lync Server 2010)
```
Get-CsAnalogDevice -Filter {DisplayName -eq "Building 14 Receptionist"} | Remove-CsAnalogDevice
```
The command shown in Example 2 deletes any analog devices that have been assigned the display name "Building 14 Receptionist".
To carry out this task, the command first calls Get-CsAnalogDevice along with the Filter parameter; the filter value {DisplayName -eq "Building 14 Receptionist"} limits the returned objects to analog devices where the DisplayName property is equal to "Building 14 Receptionist".
The returned items are then piped to, and removed by, Remove-CsAnalogDevice.
### -------------------------- EXAMPLE 2 -------------------------- (Lync Server 2013)
```
Get-CsAnalogDevice -Filter {DisplayName -eq "Building 14 Receptionist"} | Remove-CsAnalogDevice
```
The command shown in Example 2 deletes any analog devices that have been assigned the display name "Building 14 Receptionist".
To carry out this task, the command first calls Get-CsAnalogDevice along with the Filter parameter; the filter value {DisplayName -eq "Building 14 Receptionist"} limits the returned objects to analog devices where the DisplayName property is equal to "Building 14 Receptionist".
The returned items are then piped to, and removed by, Remove-CsAnalogDevice.
### -------------------------- EXAMPLE 2 -------------------------- (Skype for Business Server 2015)
```
Get-CsAnalogDevice -Filter {DisplayName -eq "Building 14 Receptionist"} | Remove-CsAnalogDevice
```
The command shown in Example 2 deletes any analog devices that have been assigned the display name "Building 14 Receptionist".
To carry out this task, the command first calls the Get-CsAnalogDevice cmdlet along with the Filter parameter; the filter value {DisplayName -eq "Building 14 Receptionist"} limits the returned objects to analog devices where the DisplayName property is equal to "Building 14 Receptionist".
The returned items are then piped to, and removed by, the Remove-CsAnalogDevice cmdlet.
### -------------------------- Example 3 ------------------------ (Lync Server 2010)
```
Get-CsAnalogDevice -Filter {VoicePolicy -eq "RedmondVoicePolicy"} | Remove-CsAnalogDevice
```
The preceding command deletes all of the analog devices that have been assigned the voice policy RedmondVoicePolicy.
To do this, Get-CsAnalogDevice and the Filter parameter are used to retrieve all of the analog devices where the VoicePolicy property is equal to RedmondVoicePolicy.
The filtered collection is then piped to the Remove-CsAnalogDevice cmdlet, which deletes each item in that collection.
### -------------------------- EXAMPLE 3 -------------------------- (Lync Server 2013)
```
Get-CsAnalogDevice -Filter {VoicePolicy -eq "RedmondVoicePolicy"} | Remove-CsAnalogDevice
```
Example 3 deletes all of the analog devices that have been assigned the voice policy RedmondVoicePolicy.
To do this, Get-CsAnalogDevice and the Filter parameter are used to retrieve all of the analog devices where the VoicePolicy property is equal to RedmondVoicePolicy.
The filtered collection is then piped to the Remove-CsAnalogDevice cmdlet, which deletes each item in that collection.
### -------------------------- EXAMPLE 3 -------------------------- (Skype for Business Server 2015)
```
Get-CsAnalogDevice -Filter {VoicePolicy -eq "RedmondVoicePolicy"} | Remove-CsAnalogDevice
```
Example 3 deletes all of the analog devices that have been assigned the voice policy RedmondVoicePolicy.
To do this, the Get-CsAnalogDevice cmdlet and the Filter parameter are used to retrieve all of the analog devices where the VoicePolicy property is equal to RedmondVoicePolicy.
The filtered collection is then piped to the Remove-CsAnalogDevice cmdlet, which deletes each item in that collection.
### -------------------------- Example 4 ------------------------ (Lync Server 2010)
```
Get-CsAnalogDevice -Filter {AnalogFax -eq $True} | Remove-CsAnalogDevice
```
The command shown in Example 4 removes all the analog fax machines currently in use in the organization.
To carry out this task, Get-CsAnalogDevice is called first along with the Filter parameter; the filter value {AnalogFax -eq $True} picks out only those devices where the AnalogFax property is equal to True.
In turn, this filtered collection is piped to Remove-CsAnalogDevice, which removes each item in the collection.
### -------------------------- EXAMPLE 4 -------------------------- (Lync Server 2013)
```
Get-CsAnalogDevice -Filter {AnalogFax -eq $True} | Remove-CsAnalogDevice
```
The command shown in Example 4 removes all the analog fax machines currently in use in the organization.
To carry out this task, Get-CsAnalogDevice is called first along with the Filter parameter; the filter value {AnalogFax -eq $True} picks out only those devices where the AnalogFax property is equal to True.
In turn, this filtered collection is piped to Remove-CsAnalogDevice, which removes each item in the collection.
### -------------------------- EXAMPLE 4 -------------------------- (Skype for Business Server 2015)
```
Get-CsAnalogDevice -Filter {AnalogFax -eq $True} | Remove-CsAnalogDevice
```
The command shown in Example 4 removes all the analog fax machines currently in use in the organization.
To carry out this task, the Get-CsAnalogDevice cmdlet is called first along with the Filter parameter; the filter value {AnalogFax -eq $True} picks out only those devices where the AnalogFax property is equal to True.
In turn, this filtered collection is piped to the Remove-CsAnalogDevice cmdlet, which removes each item in the collection.
## PARAMETERS
### -Identity
**Below Content Applies To:** Lync Server 2010
Unique identifier for the analog device to be removed.
Analog devices are identified by using the Active Directory distinguished name (DN) of the associated contact object.
By default, these devices use a GUID (globally unique identifier) as their common name; that means analog devices will typically have an Identity similar to this: CN={ce84964a-c4da-4622-ad34-c54ff3ed361f},OU=Redmond,DC=Litwareinc,DC=com.
Because of that, you might find it easier to retrieve analog devices by using the Get-CsAnalogDevice cmdlet, and then piping the returned objects to Remove-CsAnalogDevice.
**Below Content Applies To:** Lync Server 2013
Unique identifier for the analog device to be removed.
Analog devices are identified by using the Active Directory distinguished name (DN) of the associated contact object.
By default, these devices use a globally unique identifier (GUID) as their common name; that means analog devices will typically have an Identity similar to this: CN={ce84964a-c4da-4622-ad34-c54ff3ed361f},OU=Redmond,DC=Litwareinc,DC=com.
Because of that, you might find it easier to retrieve analog devices by using the Get-CsAnalogDevice cmdlet, and then piping the returned objects to Remove-CsAnalogDevice.
**Below Content Applies To:** Skype for Business Server 2015
Unique identifier for the analog device to be removed.
Analog devices are identified by using the Active Directory distinguished name (DN) of the associated contact object.
By default, these devices use a globally unique identifier (GUID) as their common name; that means analog devices will typically have an Identity similar to this: CN={ce84964a-c4da-4622-ad34-c54ff3ed361f},OU=Redmond,DC=Litwareinc,DC=com.
Because of that, you might find it easier to retrieve analog devices by using the Get-CsAnalogDevice cmdlet, and then piping the returned objects to the Remove-CsAnalogDevice cmdlet.
```yaml
Type: UserIdParameter
Parameter Sets: (All)
Aliases:
Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015
Required: True
Position: 1
Default value: None
Accept pipeline input: True (ByPropertyName, ByValue)
Accept wildcard characters: False
```
### -WhatIf
Describes what would happen if you executed the command without actually executing the command.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: wi
Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
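For example, to preview which analog devices would be removed by the command from Example 4 without actually deleting anything, add the WhatIf switch:

```powershell
Get-CsAnalogDevice -Filter {AnalogFax -eq $True} | Remove-CsAnalogDevice -WhatIf
```

The devices that would be deleted are reported, but no contact objects are removed.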
### -Confirm
Prompts you for confirmation before executing the command.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: cf
Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
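Because Confirm is a standard Windows PowerShell switch, scripts can also suppress the confirmation prompt explicitly by passing $false to it (the Identity shown here reuses the sample value from this topic):

```powershell
Remove-CsAnalogDevice -Identity "CN={ce84964a-c4da-4622-ad34-c54ff3ed361f},OU=Redmond,DC=Litwareinc,DC=com" -Confirm:$false
```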
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
Microsoft.Rtc.Management.ADConnect.Schema.OCSADAnalogDeviceContact object.
Remove-CsAnalogDevice accepts pipelined instances of the analog device object.
Microsoft.Rtc.Management.ADConnect.Schema.OCSADAnalogDeviceContact object.
The Remove-CsAnalogDevice cmdlet accepts pipelined instances of the analog device object.
## OUTPUTS
Remove-CsAnalogDevice deletes existing instances of the Microsoft.Rtc.Management.ADConnect.Schema.OCSADAnalogDeviceContact object.
The Remove-CsAnalogDevice cmdlet deletes existing instances of the Microsoft.Rtc.Management.ADConnect.Schema.OCSADAnalogDeviceContact object.
## NOTES
## RELATED LINKS
[Online Version](http://technet.microsoft.com/EN-US/library/61250894-fde6-476d-aaa2-ec5692af02b3(OCS.14).aspx)
[Get-CsAnalogDevice]()
[Move-CsAnalogDevice]()
[New-CsAnalogDevice]()
[Set-CsAnalogDevice]()
[Online Version](http://technet.microsoft.com/EN-US/library/61250894-fde6-476d-aaa2-ec5692af02b3(OCS.15).aspx)
[Online Version](http://technet.microsoft.com/EN-US/library/61250894-fde6-476d-aaa2-ec5692af02b3(OCS.16).aspx)
# React Version
Pure React using ES6 (ECMAScript 2015) syntax; no jQuery, no Bootstrap, no other libraries.
## How to run
1. **Install [Node 4.0.0 or greater](https://nodejs.org)** - (5.0 or greater is recommended for optimal build performance).
2. **Clone the project**. `git clone https://github.com/BioMaRu/indiedish-test-react.git`.
3. **Go to project directory**. `cd indiedish-test-react`
4. **Install Dependencies**. `npm install`
5. **Run the app**. `npm start`
Making sure CloudFormation properties that require at least one property from a list. More than one can be included.
[SOURCE](https://github.com/awslabs/cfn-python-lint)
## WIP
Custom jar for pushing shredded Snowplow part files in HDFS into DashDB tables. The current iteration only facilitates movement of atomic-events part files into a DashDB 'events' table.
Build with Maven (`mvn install`).
Mandatory jar arguments:

- `--dbhost <DashDB Host (no port)>`
- `--dbtable <DashDB Table>`
- `--dbuser <DashDB Username>`
- `--dbpassword <DashDB Password>`
- `--hdfspath <Part Files folder in HDFS>`
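Put together, an invocation might look like the following; the jar file name, host, and HDFS path are illustrative assumptions, not taken from the project:

```sh
java -jar snowplow-dashdb-conn.jar \
  --dbhost mydashdb.example.com \
  --dbtable EVENTS \
  --dbuser dbadmin \
  --dbpassword secret \
  --hdfspath /snowplow/shredded/good/atomic-events
```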
# Compiler
## Usage
Invoke with `ubc` (short for ubiquit compiler).
## Arguments
* `-no-warnings` - disables all warnings
* `-debug` - adds extra debugging code and does not optimize the code
* `-release` - removes debug code, optimizes as much as possible
## Example
````shell
ubc src/main.ub src/foo.ub
````
# Spring Boot 2.0 metrics with Actuator, Prometheus and Grafana example
![dashboard](readme/dashboard.png "Dashboard")
## Intro:
- [Spring Boot Actuator](https://docs.spring.io/spring-boot/docs/current/reference/html/production-ready.html)
- [Prometheus](https://prometheus.io/)
- [Grafana](https://grafana.com/)
## How to...
implement metrics in Spring Boot and export to Prometheus and Grafana?
#### 1. Create new Spring Boot application
or use exist one
#### 2. Add dependencies to your project
```gradle
compile('org.springframework.boot:spring-boot-starter-actuator')
runtime("io.micrometer:micrometer-registry-prometheus")
```
#### 3. Expose metrics and prometheus endpoints in application.properties
```
management.endpoints.web.exposure.include=health,info,metrics,prometheus
```
#### 4. Use the built-in metrics provided by Spring or create custom metrics for your app
Inject MeterRegistry, then create and manage your application's set of meters.
Example:
```kotlin
@Autowired
fun setCounter(meterRegistry: MeterRegistry) {
//counters -> increment value
messageCounter = meterRegistry.counter("service.message.counter")
operationCounter = meterRegistry.counter("service.message.long.operation.counter")
//gauges -> shows the current value of a meter.
lastMessageLength = meterRegistry.gauge("service.message.last.message.length", AtomicInteger())!!
//shows collection size (queue message, cache size etc...). In real app the collection implementation used should be thread safe.
messages = meterRegistry.gaugeCollectionSize("service.message.message.size", emptyList(), mutableListOf())!!
//timer -> measures the time taken for short tasks and the count of these tasks.
timer = meterRegistry.timer("service.message.long.operation.run.timer")
//other meters...
}
```
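Once registered, the meters can be updated from request-handling code. The sketch below is illustrative only; `runLongOperation` and the method shape are hypothetical, not part of the example app:

```kotlin
fun process(message: String) {
    messageCounter.increment()               // one more message handled
    lastMessageLength.set(message.length)    // gauge reports the latest length
    messages.add(message)                    // gaugeCollectionSize tracks this list
    timer.record {                           // measure the slow part
        runLongOperation(message)            // hypothetical long-running call
    }
}
```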
#### 5. Run your application
For example:
```bash
./gradlew bootRun
```
#### 6. Check your working metrics (JSON format)
```bash
curl http://localhost:8080/actuator/metrics
{"names":["jvm.buffer.memory.used","jvm.memory.used","jvm.gc.memory.allocated","jvm.memory.committed","tomcat.sessions.created","tomcat.sessions.expired","tomcat.global.request.max","tomcat.global.error","jvm.gc.max.data.size","service.hello.operation.run.timer","service.message.operation.counter","logback.events","system.cpu.count","jvm.memory.max","jvm.buffer.total.capacity","jvm.buffer.count","process.files.max","jvm.threads.daemon","process.start.time","service.message.counter","tomcat.global.sent","tomcat.sessions.active.max","tomcat.threads.config.max","service.message.last.length","jvm.gc.live.data.size","process.files.open","process.cpu.usage","service.message.message.size","tomcat.servlet.request","process.uptime","tomcat.global.received","system.load.average.1m","tomcat.cache.hit","http.server.requests","jvm.gc.pause","tomcat.servlet.error","tomcat.servlet.request.max","tomcat.cache.access","tomcat.threads.busy","tomcat.sessions.active.current","system.cpu.usage","jvm.threads.live","jvm.classes.loaded","jvm.classes.unloaded","jvm.threads.peak","tomcat.threads.current","tomcat.global.request","jvm.gc.memory.promoted","tomcat.sessions.rejected","tomcat.sessions.alive.max"]}%
```
#### 7. Check your working metrics (Prometheus format)
```bash
curl http://localhost:8080/actuator/prometheus
# HELP jvm_gc_memory_promoted_bytes_total Count of positive increases in the size of the old generation memory pool before GC to after GC
# TYPE jvm_gc_memory_promoted_bytes_total counter
jvm_gc_memory_promoted_bytes_total 24576.0
...
```
#### 8. Check your custom metrics (Prometheus format)
```bash
curl -s http://localhost:8080/actuator/prometheus |grep service_message
# HELP service_message_last_message_length
# TYPE service_message_last_message_length gauge
service_message_last_message_length 18.0
# HELP service_message_message_size
# TYPE service_message_message_size gauge
service_message_message_size 4.0
# HELP service_message_long_operation_run_timer_seconds
# TYPE service_message_long_operation_run_timer_seconds summary
service_message_long_operation_run_timer_seconds_count 74.0
service_message_long_operation_run_timer_seconds_sum 265.597506794
service_message_long_operation_run_timer_seconds_max 6.9341843
...
```
#### 9. Add Spring Boot app scrape config to prometheus.yml
[More info](https://prometheus.io/docs/prometheus/latest/configuration/configuration/)
```
- job_name: 'spring-boot-example-metric'
# Override the global default and scrape targets from this job every 5 seconds.
scrape_interval: 5s
metrics_path: '/actuator/prometheus' # path to spring boot metrics
static_configs:
- targets: ['spring-boot-example-metric:8080'] # host and port
```
#### 10. Open Prometheus dashboard
and check Spring Boot endpoint status (menu: Status/Targets)
![status](readme/prometheus-1.png "Status")
#### 11. Add dashboard for system or custom metric
(menu: Graph)
![graph](readme/prometheus-2.png "Graph")
#### 12. Open Grafana dashboard
and add Prometheus data source
![add_source](readme/grafana-1.png "Add source")
#### 13. Create new dashboard for system or custom metric
![add_metric](readme/grafana-2.png "Add metric")
or use already created.
For example:
[JVM (Micrometer) Dashboard](https://grafana.com/dashboards/4701)
## Working example
### Status
[![Build Status](https://travis-ci.org/wojciech-zurek/kotlin-spring-boot-prometheus-grafana-example.svg?branch=master)](https://travis-ci.org/wojciech-zurek/kotlin-spring-boot-prometheus-grafana-example)
### Requirements
Docker and docker-compose
### Stack
- kotlin
- gradle
- spring boot 2.0 (mvc, actuator)
- docker and docker-compose (I'm using Linux)
- prometheus
- grafana
### Download
```bash
git clone https://github.com/wojciech-zurek/kotlin-spring-boot-prometheus-grafana-example.git
```
### How to build and run
#### 1. Compile project and build docker images
```bash
cd kotlin-spring-boot-prometheus-grafana-example/
./build.sh
```
#### 2. Check docker images
```bash
docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
eu.wojciechzurek/spring-boot-example-metric 0.0.1 3499dee25307 About an hour ago 123MB
eu.wojciechzurek/spring-boot-example-metric latest 3499dee25307 About an hour ago 123MB
eu.wojciechzurek/custom-prometheus 0.0.1 17aaa34718de About an hour ago 112MB
eu.wojciechzurek/custom-prometheus latest 17aaa34718de About an hour ago 112MB
grafana/grafana latest 18cae91912fc 5 days ago 301MB
```
#### 3. Run docker-compose
```bash
cd kotlin-spring-boot-prometheus-grafana-example/
docker-compose up
```
#### 4. Check docker containers status
```bash
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
449f44b8a1d5 eu.wojciechzurek/custom-prometheus:latest "/bin/prometheus --c…" 41 minutes ago Up 41 minutes 0.0.0.0:9090->9090/tcp kotlinspringbootprometheusgrafanaexample_prometheus_1
348e2b0150be grafana/grafana "/run.sh" 41 minutes ago Up 41 minutes 0.0.0.0:3000->3000/tcp kotlinspringbootprometheusgrafanaexample_grafana_1
ab91021df26b eu.wojciechzurek/spring-boot-example-metric:latest "java -Djava.securit…" 41 minutes ago Up 41 minutes 0.0.0.0:8080->8080/tcp spring-boot-example-metric
```
#### 5. Services URL
- example spring boot controller: http://localhost:8080/message
- Prometheus dashboard http://localhost:9090/
- Grafana dashboard (**login: admin, password: admin**) http://localhost:3000/
#### 6. Open Grafana dashboard
and add Prometheus type, direct access source (HTTP URL: http://localhost:9090/)
#### 7. Test app and check metrics
```bash
curl http://localhost:8080/message
Hello Spring World%
```
#### 8. Optionally import the example dashboard
Go to: [http://localhost:3000/dashboard/import](http://localhost:3000/dashboard/import) and import [grafana-spring-boot-example.json](/grafana-spring-boot-example.json)
<properties
    pageTitle="Azure Mobile Engagement iOS SDK release notes | Microsoft Azure"
    description="Latest updates and procedures for the iOS SDK for Azure Mobile Engagement"
services="mobile-engagement"
documentationCenter="mobile"
authors="piyushjo"
manager="erikre"
editor="" />
<tags
ms.service="mobile-engagement"
ms.workload="mobile"
ms.tgt_pltfrm="mobile-ios"
ms.devlang="objective-c"
ms.topic="article"
ms.date="09/12/2016"
ms.author="piyushjo" />
#<a name="azure-mobile-engagement-ios-sdk-release-notes"></a>Azure Mobile Engagement iOS SDK release notes
##<a name="400-09122016"></a>4.0.0 (2016-09-12)
- Fixed notifications not being actioned on devices running iOS 10.
- Deprecated Xcode 7.
##<a name="324-06302016"></a>3.2.4 (2016-06-30)
- Fixed the aggregation between technical logs and other logs.
##<a name="323-06072016"></a>3.2.3 (2016-06-07)
- Fixed a bug where feedback delivery was not reported while the application was running in the background.
- Optimized the sending of technical logs.
##<a name="322-04072016"></a>3.2.2 (2016-04-07)
- Fixed a bug in HTTP request cancellation that sometimes caused a crash.
##<a name="321-12112015"></a>3.2.1 (2015-12-11)
- Fixed a delay when a deep-link notification triggers a new instance of the application.
##<a name="320-10082015"></a>3.2.0 (2015-10-08)
- Enabled Bitcode in the SDK to make it work with **Xcode 7**.
- Fixed bugs related to in-app notifications.
- Made in-app notifications reliable in low-battery and other scenarios of that kind.
- Removed extra console logs generated by third-party libraries.
##<a name="310-08262015"></a>3.1.0 (2015-08-26)
- Fixed an iOS 9 compatibility bug with a third-party library. It caused a crash when sending poll results, app info, or extra data.
##<a name="300-06192015"></a>3.0.0 (2015-06-19)
- Mobile Engagement now uses incoming push notifications.
- Dropped support for iOS 4.X. Starting from this release, the application deployment target must be at least iOS 6.
##<a name="220-05212015"></a>2.2.0 (2015-05-21)
- The Mobile Engagement device ID for devices running < iOS 6 is now based on a GUID generated at installation time.
##<a name="210-04242015"></a>2.1.0 (2015-04-24)
- Added Swift compatibility.
- A clicked notification action that uses a URL is now executed right after the application opens.
- Added a missing header file to the SDK.
- Fixed an issue related to disabling the Mobile Engagement crash reporter.
##<a name="200-02172015"></a>2.0.0 (2015-02-17)
- Initial release of Azure Mobile Engagement
- The appId/sdkKey configuration is replaced by the connection string configuration.
- Removed the API for sending and receiving arbitrary XMPP messages to and from arbitrary XMPP entities.
- Removed the API for sending and receiving messages between devices.
- Security improvements.
- Removed SmartAd tracking.
# docker
https://docs.docker.com/get-started/
https://jawsdays2019.jaws-ug.jp/session/1527/
## Quick-reference tips
### Building an image
```
cd dockerfile-dir
docker build -t image-name:image-tag .
```
### Starting a container
```
docker run --rm -it --init -p 3000:3000 -v from-dir:to-dir -u `id -u`:`id -g` image-name:image-tag
```
* `--rm` : remove the container when it exits
* `-it` : `-i` plus `-t`; keeps interactive input/output (a TTY) attached. The opposite is `-d` (detached).
* `-p` : port mapping
* `-v` : directory (volume) mapping
* `-u` : UID and GID to run as
### Removing images
```
docker image rm image-id
docker image prune
```
### Removing containers
```
docker container rm container-id
docker container prune
```
## Case 1: `docker run` fails
```
docker: Error response from daemon: failed to create endpoint pedantic_margulis on network bridge: failed to add the host (...) <=> sandbox (...) pair interfaces: operation not supported.
```
If this error appears, the kernel may have been updated since boot. Reboot and then try `docker run` again.
* https://qastack.jp/server/738773/docker-failed-to-add-the-pair-interfaces-operation-not-supported
---
title: Daily Notes
sidebar: auto
---
<style>
.go-to-top {
display: block !important;
}
</style>
* July 3, 2021
  > [One-line JS snippets that implement useful features](2021703.md)
  >
* June 18, 2021
  > [Understanding 9 major use cases of JS closures](20210618.md)
* April 23, 2021
  > [ByteDance: how to simulate the `new` operator](20210423.md)
* March 16, 2021
  > [Commonly used front-end JavaScript utility methods](20210316.md)
* March 2, 2021
  > [JS syntax: new features in ES6, ES7, ES8, ES9, ES10, ES11 and ES12](20210302.md)
* February 20, 2021
  > [JavaScript regular expressions](20210220.md)
* January 15, 2021
  > [JavaScript debounce and throttle](20210115.md)
* December 23, 2020
  > [8 commonly used JavaScript array methods](20201223.md)
Install Libs:
```sh
yarn
```
Run with:
```sh
exp start
```
---
description: Notifies the debug engine (DE) whether to stop at the current code location or to continue execution.
title: 'IDebugCanStopEvent2::CanStop | Microsoft Docs'
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- IDebugCanStopEvent2::CanStop
helpviewer_keywords:
- IDebugCanStopEvent2::CanStop
ms.assetid: 7d61adbe-6b3d-41f3-86a1-45d9cc01a7f8
author: leslierichardson95
ms.author: lerich
manager: jmartens
ms.technology: vs-ide-debug
ms.workload:
- vssdk
dev_langs:
- CPP
- CSharp
ms.openlocfilehash: dd575d6bb1afdf296eff6ec3ac3a08a9551618b8
ms.sourcegitcommit: b12a38744db371d2894769ecf305585f9577792f
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 09/13/2021
ms.locfileid: "126636369"
---
# <a name="idebugcanstopevent2canstop"></a>IDebugCanStopEvent2::CanStop
Notifies the debug engine (DE) whether to stop at the current code location or to continue execution.
## <a name="syntax"></a>Syntax
```cpp
HRESULT CanStop (
BOOL fCanStop
);
```
```csharp
int CanStop (
int fCanStop
);
```
## <a name="parameters"></a>Parameters
`fCanStop`\
[in] Nonzero (`TRUE`) if the DE should stop at the current code location; otherwise, zero (`FALSE`).
## <a name="return-value"></a>Return Value
If successful, returns `S_OK`; otherwise, returns an error code.
## <a name="remarks"></a>Remarks
The receiver of this event typically determines the reason for the pending stop by calling the [GetReason](../../../extensibility/debugger/reference/idebugcanstopevent2-getreason.md) method, and then calls the `IDebugCanStopEvent2::CanStop` method with the appropriate response.
If the DE stops, it sends an event describing the reason for stopping. The two events typically sent are a user or signal break, represented by the [IDebugBreakEvent2](../../../extensibility/debugger/reference/idebugbreakevent2.md) interface, and a breakpoint event, represented by the [IDebugBreakpointEvent2](../../../extensibility/debugger/reference/idebugbreakpointevent2.md) interface.
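A minimal sketch of how a handler for this event might respond (hypothetical code; error handling elided):

```cpp
// Hypothetical handler for an incoming IDebugCanStopEvent2.
HRESULT OnCanStopEvent(IDebugCanStopEvent2 *pEvent)
{
    CANSTOP_REASON reason;
    if (FAILED(pEvent->GetReason(&reason)))
        return pEvent->CanStop(FALSE);  // unknown reason: keep running

    // Inspect the reason and decide; here we simply allow the stop.
    return pEvent->CanStop(TRUE);
}
```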
## <a name="see-also"></a>See also
- [IDebugCanStopEvent2](../../../extensibility/debugger/reference/idebugcanstopevent2.md)
- [IDebugBreakEvent2](../../../extensibility/debugger/reference/idebugbreakevent2.md)
- [IDebugBreakpointEvent2](../../../extensibility/debugger/reference/idebugbreakpointevent2.md)
- [GetReason](../../../extensibility/debugger/reference/idebugcanstopevent2-getreason.md)
"# Symfony_KurumsalSite_2013010213025"
---
id: 1274
title: When genetic tests are basically horoscopes
date: 2016-11-03T11:27:18+00:00
author: ggbjr
layout: post
guid: http://drgarybennett.com/?p=1274
permalink: /2016/11/03/when-genetic-tests-are-basically-horoscopes/
categories:
- cool tech
- digital health
---
![ ](https://images.unsplash.com/photo-1515942661900-94b3d1972591?ixlib=rb-0.3.5&q=85&fm=jpg&crop=entropy&cs=srgb&dl=josh-rangel-513683-unsplash.jpg&s=e7e833db3a37747de37170975c3f078a)
Fantastic piece [in today’s Inside Stat](https://www.statnews.com/2016/11/03/genetic-testing-fitness-nutrition/).
> The tips I got back were almost comically generic. One piece of advice from Kinetic Diagnostics on how to compensate for my increased risk of muscle cramping? “Do proper stretching and muscle warm ups before and after exercise.”
>
> DNAFit’s recommendation to make up for a variant that predisposes me to see fewer gains from endurance training? “Stay sufficiently hydrated.”
>
> Kinetic Diagnostics said I was at elevated risk of high blood pressure; DNAFit said I was likely to experience fewer problems with blood pressure. They both offered the same advice, supposedly tailored to my genotype: exercise.
>
> (When I later asked them about this recommendation, the companies acknowledged that such advice could benefit anyone but insisted that people with my genotype would find it especially useful.)
I suspect that this will mostly be interpreted as an indictment of the athletics genetic testing “industry.” And, they seem to deserve it. But there’s a bigger issue here: many similar companies enter the market with laughably limited evidence that their “personalized recommendations” are actually informed by science.
> Then there were the interpretations that flat-out contradicted one another.
>
> The tests each looked at different regions of my genome — which may have been necessary to distinguish themselves from their competitors, but which in and of itself suggests just how much this field is in its infancy. So it wasn’t possible to compare the complete results from each company head-to-head.
>
> But among the scores of data points, I found 20 genetic variants that showed up on two or more test results. The companies all gave me the same genetic readout on those variants, so I have little doubt they correctly analyzed the cells in the cheek swab I’d sent them. In six cases, however, the interpretation I got from one company directly contradicted the interpretation from another.
I’m sensitive to the idea that [the long time it takes to generate] evidence frequently slows the process of bringing innovative tools to market. However, this is a helpful reminder that speed can also disadvantage consumers (while rewarding founders).
| 82.147059 | 390 | 0.794128 | eng_Latn | 0.999145 |
3ba67ddbef29ef0d7816e0238b52f0f792fb7b14 | 56 | md | Markdown | notes.md | Suranjandas7/Tropicana | fcfe020ba95cb6d0bbc597b1aa99d5076e9093b2 | [
"BSD-3-Clause"
] | null | null | null | notes.md | Suranjandas7/Tropicana | fcfe020ba95cb6d0bbc597b1aa99d5076e9093b2 | [
"BSD-3-Clause"
] | null | null | null | notes.md | Suranjandas7/Tropicana | fcfe020ba95cb6d0bbc597b1aa99d5076e9093b2 | [
"BSD-3-Clause"
] | null | null | null | To-do:
1. Write a testing framework
2. Implement Pandas
| 14 | 28 | 0.767857 | eng_Latn | 0.566859 |
3ba6b8f050892f0325d5ecffb32b7a940c216649 | 543 | md | Markdown | README.md | asaladino/smart-form-helper | 9df95386b559c262ae8a0d6eaed76f3f52d08197 | [
"MIT"
] | null | null | null | README.md | asaladino/smart-form-helper | 9df95386b559c262ae8a0d6eaed76f3f52d08197 | [
"MIT"
] | null | null | null | README.md | asaladino/smart-form-helper | 9df95386b559c262ae8a0d6eaed76f3f52d08197 | [
"MIT"
] | null | null | null | # Smart Form Helper
Helps generate c# from an Ektron smart form xsd. This is useful when you have a widget that is
dependant on a smart form. This app is as is and will never be updated. In fact, may your god
have mercy on you if you have Ektron (or any of it's incarnations) as your cms. There are
plenty of real cms' out there, they are free and superior.
![alt text](notes/filled-in.png "Smart Form Helper")
One item to note, make sure you know where the xsd.exe executable is.
![alt text](notes/configuration.png "Configuration")
| 36.2 | 95 | 0.747698 | eng_Latn | 0.999476 |
3ba708a520c5f6baf4a26552c9b733f2fbd7aeaa | 11 | md | Markdown | README.md | Musty14/rest | a4d1e2b5ed05f28c994c0891978c72207adbabfc | [
"MIT"
] | null | null | null | README.md | Musty14/rest | a4d1e2b5ed05f28c994c0891978c72207adbabfc | [
"MIT"
] | null | null | null | README.md | Musty14/rest | a4d1e2b5ed05f28c994c0891978c72207adbabfc | [
"MIT"
] | null | null | null | # rest
Dag
| 3.666667 | 6 | 0.636364 | eng_Latn | 0.930999 |
3ba78e13daa665e43f466db0cbc834da1b9c4673 | 1,039 | md | Markdown | doc/dev-clone-compile.md | russelldb/machi | e87bd59a9777d805b00f9e9981467eb28e28390c | [
"Apache-2.0"
] | 143 | 2015-03-02T16:37:27.000Z | 2022-02-16T00:54:42.000Z | doc/dev-clone-compile.md | russelldb/machi | e87bd59a9777d805b00f9e9981467eb28e28390c | [
"Apache-2.0"
] | 60 | 2015-08-06T03:58:12.000Z | 2016-03-29T11:07:09.000Z | doc/dev-clone-compile.md | russelldb/machi | e87bd59a9777d805b00f9e9981467eb28e28390c | [
"Apache-2.0"
] | 27 | 2015-03-03T14:22:13.000Z | 2021-06-04T07:43:15.000Z | # Clone and compile Machi
Clone the Machi source repo and compile the source and test code. Run
the following commands at your login shell:
cd /tmp
git clone https://github.com/basho/machi.git
cd machi
git checkout master
make # or 'gmake' if GNU make uses an alternate name
Then run the unit test suite. This may take up to two minutes or so
to finish.
make test
At the end, the test suite should report that all tests passed. The
actual number of tests shown in the "All `X` tests passed" line may be
different than the example below.
[... many lines omitted ...]
module 'event_logger'
module 'chain_mgr_legacy'
=======================================================
All 90 tests passed.
If you had a test failure, a likely cause may be a limit on the number
of file descriptors available to your user process. (Recent releases
of OS X have a limit of 1024 file descriptors, which may be too slow.)
The output of the `limit -n` will tell you your file descriptor limit.
| 33.516129 | 70 | 0.680462 | eng_Latn | 0.996281 |
3ba7996301c07aff41d4193e357eaaf1a4c4701e | 1,173 | md | Markdown | README.md | PMheart/CPUFriend | 72124c1d0558651e2ad931d0ff9c76ad38be133e | [
"BSD-3-Clause"
] | 75 | 2017-08-06T19:12:10.000Z | 2018-09-24T22:48:25.000Z | README.md | PMheart/CPUFriend | 72124c1d0558651e2ad931d0ff9c76ad38be133e | [
"BSD-3-Clause"
] | 7 | 2018-04-22T21:08:41.000Z | 2018-09-22T22:51:58.000Z | README.md | PMheart/CPUFriend | 72124c1d0558651e2ad931d0ff9c76ad38be133e | [
"BSD-3-Clause"
] | 15 | 2017-08-07T08:52:12.000Z | 2018-08-19T06:00:04.000Z | CPUFriend
=========
[![Build Status](https://github.com/acidanthera/CPUFriend/workflows/CI/badge.svg?branch=master)](https://github.com/acidanthera/CPUFriend/actions) [![Scan Status](https://scan.coverity.com/projects/16841/badge.svg?flat=1)](https://scan.coverity.com/projects/16841)
A [Lilu](https://github.com/acidanthera/Lilu) plug-in for dynamic power management data injection.
#### Notes
This repository must be compiled with latest [Lilu](https://github.com/acidanthera/Lilu) and [MacKernelSDK](https://github.com/acidanthera/MacKernelSDK), otherwise the compilation will fail!
Note: ***Debug version of Lilu.kext and MacKernelSDK project folder should be put in the same folder as CPUFriend! And debug versions of Lilu and CPUFriend should also be used together when debugging!***
#### Configuration
See [Instructions](https://github.com/acidanthera/CPUFriend/blob/master/Instructions.md) for more details.
#### Credits
- [Apple](https://www.apple.com) for macOS
- [vit9696](https://github.com/vit9696) for [Lilu.kext](https://github.com/acidanthera/Lilu) and various helps
- [PMheart](https://github.com/PMheart) for writing the software and maintaining it
| 58.65 | 264 | 0.765558 | eng_Latn | 0.37765 |
3ba844ccee72ed6dd5b5f8cd3aa5e34e153b3c8d | 1,049 | md | Markdown | src/dinos/psittacosaurus.md | mhaack/mias-dino-facts | f7369b43ef317b94f0b25b433ae7e9ee36f6a375 | [
"MIT"
] | 1 | 2021-12-19T15:50:11.000Z | 2021-12-19T15:50:11.000Z | src/dinos/psittacosaurus.md | mhaack/mias-dino-facts | f7369b43ef317b94f0b25b433ae7e9ee36f6a375 | [
"MIT"
] | 6 | 2022-01-22T11:14:51.000Z | 2022-02-03T21:28:17.000Z | src/dinos/psittacosaurus.md | mhaack/mias-dino-facts | f7369b43ef317b94f0b25b433ae7e9ee36f6a375 | [
"MIT"
] | null | null | null | ---
title: Psittacosaurus
namesuffix: mongoliensis
meaning: Papageienechse
date: 2021-12-25
image: /img/dinos/psittacosaurus.jpg
tags:
- Herbivore 🌿
- Kreide 🦴
- Lieblingsdino ⭐
food: Herbivore
location: Asien
locations:
- MNG
finder: Mitglieder des New Yorker American Museum of Natural History
weight: 50kg
size: 2m
years: 130-100
group: Ceratopsia
---
Der **Psittacosaurus** ist mit dem [Triceratops ](/dinos/triceratops/)und dem [Kosmoceratops ](/dinos/kosmoceratops)verwand, denn sie gehören alle zu der Ceratopsia Gruppe. Der **Psittacosaurus** hatte einen Schnabel wie ein Papagei.
![Papagei ](/img/dinos/papagei.jpg)
Der **Psittacosaurus** hat auch einen Schwanz der mit Stacheln bedeckt war, ähnlich wie bei einem Stachelschwein, mit dem er sich vermutlich verteidigt hat.
![Stachelschwein](/img/dinos/stachelschwein.jpg)
Vermutlich hat der **Psitacosaurus** Steine gefressen, damit sie die Blätter und andere Pflanzenteile zermahlten.
Quellen:
* <https://piqs.de>
* <https://www.flickr.com>
* <https://culturacientifica.com> | 29.971429 | 233 | 0.765491 | deu_Latn | 0.940848 |
3ba910544e522697b8fa3571f82b2ff5f7b733e6 | 3,123 | md | Markdown | readme.md | Supermegadex/compy | 5c72dc1b887c7b4a120182c5c1f7f9c210404ed7 | [
"MIT"
] | null | null | null | readme.md | Supermegadex/compy | 5c72dc1b887c7b4a120182c5c1f7f9c210404ed7 | [
"MIT"
] | 1 | 2021-09-01T05:28:24.000Z | 2021-09-01T05:28:24.000Z | readme.md | danielnoon/compy | 5c72dc1b887c7b4a120182c5c1f7f9c210404ed7 | [
"MIT"
] | null | null | null | # Compy
**You**: What the hell is this? This isn't an operating system!
**Me**: Yep.
### What is this?
Compy is my attempt at writing a functional ABI from scratch in JavaScript (TypeScript). You can check out the "processor" instructions in `src/Process.ts` and system calls in `src/OS.ts`.
Once completed, Compy *should* work as a full-fledged application host that can run arbitrary binary code compiled for the virtual machine. I am using x86 as a standard for what opcodes should be included.
I think what I'm going to end up doing with this is building out the kernel into something that applications can interface with to achieve productive things.
A few things that the kernel needs to support in order for this to happen:
1. filesystem (open, close, read, write)
2. process spawning (fork, execve, clone)
3. i/o
4. networking
### Assembly
Because I want to, I am writing an assembler for the project. The asm syntax should looks familiar if you've written asm before. The biggest difference from familiar grammars stemmed from my annoyance at Intel/AT&T differences in parameter order: I introduced an arrow to indicate source/destination instead of relative position. A full example is in `test-prog.asm`, but here's an overview:
#### Instructions
Like your favorite flavor of asm, each line is one instruction and its parameters
(Note: I am writing the line numbers in parentheses because it looks nice. It is not syntactically valid).
```
(1) mov 3 -> (eax)
(2) mov (ebx) <- (eax)
(3) mov 1 -> [0]
(4) mov 6 -> [(ebx)]
```
In this trivial example, you can see the effects of the arrow syntax. Source is at the tail of the arrow and the destination is at the head. Direction can be mixed throughout a program, which could be deemed a bad thing, but I think that the arrow is obvious enough for glaceability.
You will also notice the different ways of addressing memory and registers. Registers are denoted with parentheses. Fun tip: eax/ebx/ecx/edx are just mnemonics for numbers. You can use numbers in place of the names if you want (i.e. `(0)` for eax). Memory is addressed in square brackets, and the values in registers can be used as memory addresses through `[(reg)]` syntax.
#### Labels and branching
Jumps can be made like any other asm syntax. Labels are defined with `label:` and passed to jumps with `.label`.
```
(1) loop:
(2) inc (ecx)
(3) cmp (ecx) 5
(4) jeq .loop
```
This example should loop five times before exiting.
#### System Calls
Because there's really just one software interrupt that's used, I didn't see a need to define which interrupt to call. Just use the `int` instruction without any parameters.
```
(1) mov 1 -> (eax) ; syscall 1 is print number
(2) mov 5 -> (ebx)
(3) int
```
This should print "5" to the host console.
#### Assembling
I have written a command line tool to compile the asm to bytecode. To install, run `npm link` in this directory. To run, execute `compy -a <path to asm>`.
## Summary
This is dumb and just something fun for me to do. Unless something compels you to work on this for fun in your free time, you can just ignore this.
| 46.61194 | 391 | 0.741595 | eng_Latn | 0.999671 |
3ba943c7d7c7efa5b60eed15007c7085865784a0 | 838 | md | Markdown | docs/framework/wcf/diagnostics/tracing/system-identitymodel-selectors-generalinformation.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/wcf/diagnostics/tracing/system-identitymodel-selectors-generalinformation.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/wcf/diagnostics/tracing/system-identitymodel-selectors-generalinformation.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: System.IdentityModel.Selectors.GeneralInformation
ms.date: 03/30/2017
ms.assetid: 60eff0ce-cf24-49d0-bc62-66bc8f684322
ms.openlocfilehash: a8cedfc9751e029616a6b30f19add1043a9b7d36
ms.sourcegitcommit: bc293b14af795e0e999e3304dd40c0222cf2ffe4
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 11/26/2020
ms.locfileid: "96270999"
---
# <a name="systemidentitymodelselectorsgeneralinformation"></a>System.IdentityModel.Selectors.GeneralInformation
System.IdentityModel.Selectors.GeneralInformation
## <a name="description"></a>Descrição
Essas são informações gerais.
## <a name="see-also"></a>Veja também
- [Rastreamento](index.md)
- [Utilizando o rastreamento para solucionar problemas em seu aplicativo](using-tracing-to-troubleshoot-your-application.md)
- [Administração e diagnóstico](../index.md)
| 33.52 | 124 | 0.800716 | por_Latn | 0.264366 |
3ba9b2bd57defa1083b23266f0291cbab20d5c20 | 61 | md | Markdown | docs_src/README.md | HHeda/qucat | f369cddde1d4041e53e9d524d06f2b1ed59b92f6 | [
"MIT"
] | 2 | 2020-01-13T02:52:30.000Z | 2021-08-11T12:25:07.000Z | docs_src/README.md | HHeda/qucat | f369cddde1d4041e53e9d524d06f2b1ed59b92f6 | [
"MIT"
] | null | null | null | docs_src/README.md | HHeda/qucat | f369cddde1d4041e53e9d524d06f2b1ed59b92f6 | [
"MIT"
] | 1 | 2021-05-18T03:37:33.000Z | 2021-05-18T03:37:33.000Z | To generate the documentation (on Windows), run make_html.bat | 61 | 61 | 0.819672 | eng_Latn | 0.655337 |
3baa53fe5796902727049ac79186a2a4b3de3013 | 595 | md | Markdown | mip-sytown-ad/README.md | lxhmm921/mip-extensions-platform | 9d2f4aa8d4708812c902b81469fa1d056c2bbabc | [
"MIT"
] | 35 | 2017-07-07T01:15:46.000Z | 2020-06-28T06:26:57.000Z | mip-sytown-ad/README.md | izzhip/mip-extensions-platform | d84c2297d6b3ced1d4cd4415ba6df03dad251609 | [
"MIT"
] | 48 | 2017-02-15T11:01:58.000Z | 2019-05-22T03:05:38.000Z | mip-sytown-ad/README.md | izzhip/mip-extensions-platform | d84c2297d6b3ced1d4cd4415ba6df03dad251609 | [
"MIT"
] | 86 | 2017-03-02T06:39:22.000Z | 2020-11-02T06:49:31.000Z | # mip-sytown-ad
mip-sytown-ad 尚一堂广告组件
标题|内容
----|----
类型|通用
支持布局|N/S
所需脚本|https://c.mipcdn.com/static/v1/mip-sytown-ad/mip-sytown-ad.js
## 示例
### 普通用法
```html
<mip-sytown-ad ad-id="详情id" type="类型"></mip-sytown-ad>
```
### 嵌套mip-ad
```html
<mip-sytown-ad ad-id="详情id" type="类型">
<div class="mip-adbd">
<mip-ad layout="responsive" type="ad-baidu" cproid="u1234567"></mip-ad>
</div>
</mip-sytown-ad>
```
## 属性
### ad-id
说明:详情页的id
必选项:是
类型:字符串
### type
说明:文章类型
必选项:是
类型:数字
取值范围:文章:1, 视频:2, 音频:3
## 注意事项
mip-sytown-ad中嵌套mip-ad组件,默认展示mip-ad广告, 当接口返回数据后显示mip-sytown-ad广告
| 12.395833 | 76 | 0.620168 | yue_Hant | 0.099819 |
3baa9a8b2478e68dfe907d3353ac0af5673896fc | 41 | md | Markdown | README.md | ignaciorod1/Hidalgo-head | dcc25a882d7161f7dea5922a490bb55f232dc00a | [
"MIT"
] | null | null | null | README.md | ignaciorod1/Hidalgo-head | dcc25a882d7161f7dea5922a490bb55f232dc00a | [
"MIT"
] | null | null | null | README.md | ignaciorod1/Hidalgo-head | dcc25a882d7161f7dea5922a490bb55f232dc00a | [
"MIT"
] | null | null | null | # Hidalgo-head
Control of Hidalgo's head
| 13.666667 | 25 | 0.780488 | eng_Latn | 0.971972 |
3babc84c42c874c7ce17c48fd0c3b7c83cd75031 | 1,372 | md | Markdown | src/pages/live-performances/ten-min-devised-piece.md | casperleerink/yifan-gong | 0cc8b29353fac24e5b3e282ec931311e8f0ce1ef | [
"MIT"
] | null | null | null | src/pages/live-performances/ten-min-devised-piece.md | casperleerink/yifan-gong | 0cc8b29353fac24e5b3e282ec931311e8f0ce1ef | [
"MIT"
] | null | null | null | src/pages/live-performances/ten-min-devised-piece.md | casperleerink/yifan-gong | 0cc8b29353fac24e5b3e282ec931311e8f0ce1ef | [
"MIT"
] | null | null | null | ---
templateKey: work-page
title: 10-min Devised Piece (Theatre Live Recording) (2018)
date: 2018-12-03T08:00:00.000Z
image: /assets/48038501_10157406093167985_7053698599233781760_o.jpg
---
# Full Live Recording
<div class="lines-1"></div>
**Role: Deviser, dancer, choreographer, costume & prop designer, music director**
<div class="lines-1"></div>
<div class="video-container"><iframe src="https://www.youtube.com/embed/b-fZyirWg0g" class="video" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></div>
<div class="lines-1"></div>
# Cast
<!--StartFragment-->
**Devisers:**
**Belle Hernberg-Johnson**
**Haotian(Cedric) Liu**
**Yifan Gong.**
<!--EndFragment-->
<div class="lines-1"></div>
# Making Process & Behind the Scenes
<div class="lines-1"></div>
### Behind the Scenes No.1:
**Please click the link below to watch the video of Behind the scenes No.1:**
**<https://b23.tv/gG1shO>**
<div class="lines-1"></div>
### Behind the Scenes No.2:
**Please click the link below to watch the video of Behind the scenes No.2:**
**<https://b23.tv/UeAUyV>**
<div class="lines-1"></div>
# Reflection and Documentation
<div class="lines-1"></div>
[Click here to read or download the Reflection on 10-min Devised Piece](/assets/reflection-of-project-2.pdf)
<div class="lines-1"></div> | 22.491803 | 226 | 0.701166 | eng_Latn | 0.320947 |
3bac59fc9c02d23602c705373a3e73992d74cdba | 1,147 | md | Markdown | README.md | argosopentech/lindy-news | 3bd1a421f2df4645e203c4d9d4baaf3a0c679430 | [
"MIT"
] | 1 | 2021-08-11T12:55:39.000Z | 2021-08-11T12:55:39.000Z | README.md | argosopentech/lindy-news | 3bd1a421f2df4645e203c4d9d4baaf3a0c679430 | [
"MIT"
] | null | null | null | README.md | argosopentech/lindy-news | 3bd1a421f2df4645e203c4d9d4baaf3a0c679430 | [
"MIT"
] | null | null | null | # Lindy News
## Lindy
- [In Resignation Speech, Cuomo Makes a Last Play to Preserve His Legacy](https://www.nytimes.com/2021/08/10/nyregion/cuomo-legacy-resignation-speech.html)
- [What’s in the new infrastructure bill — and why it’s a big deal](https://www.vox.com/22598883/infrastructure-deal-bipartisan-bill-biden-manchin)
## Not Lindy
- [Railing at Enemies and Pleading for Time: Inside Cuomo’s Final Days](https://www.nytimes.com/2021/08/10/us/politics/cuomo-resignation-timeline.html)
- [Gaming culture is toxic. A major lawsuit might finally change it.](https://www.vox.com/22617457/activision-blizzard-lawsuit-ubisoft-open-letter-toxic-gaming-culture)
- [Pelosi, AOC vow to block infrastructure bill in favor of $3.5T liberal wish list](https://www.foxnews.com/politics/house-senate-infrastructure-bill-pelosi-aoc)
- [Plants, Heavy Metals, and the Lingering Scars of World War I ](https://www.atlasobscura.com/articles/zone-rouge-plant-growth)
### Links
- [Inspiration](https://twitter.com/paulg/status/1425147126032519172)
- [Data](https://www.kaggle.com/therohk/million-headlines)
- [Development video](https://youtu.be/XQGDCGPACSU)
| 63.722222 | 168 | 0.770706 | yue_Hant | 0.263096 |
3badf0ab10c39f4b67dfa291c0170d88fc8a2a59 | 45 | md | Markdown | packages/ts/generator-typescript-plugin-backbone/README.md | vaadin/hilla | bf4df872a45244427b355c75854b65107af0264c | [
"Apache-2.0"
] | 164 | 2022-01-18T09:05:11.000Z | 2022-03-31T09:19:31.000Z | packages/ts/generator-typescript-plugin-model/README.md | vaadin/hilla | bf4df872a45244427b355c75854b65107af0264c | [
"Apache-2.0"
] | 190 | 2022-01-18T07:15:21.000Z | 2022-03-31T17:18:05.000Z | packages/ts/generator-typescript-plugin-model/README.md | vaadin/hilla | bf4df872a45244427b355c75854b65107af0264c | [
"Apache-2.0"
] | 9 | 2022-01-19T08:53:57.000Z | 2022-03-29T14:23:00.000Z | # Hilla TypeScript Generator Backbone Plugin
| 22.5 | 44 | 0.844444 | kor_Hang | 0.318655 |
3bb04a35d9e12045db4453a140374c0ea4f9e616 | 2,538 | md | Markdown | _posts/2014-02-13-Relationships-Are-About-Community.md | cameroneshgh/cameroneshgh.github.io | da8be1ac917adb45c8daa0fdace83a08b19774b5 | [
"MIT"
] | null | null | null | _posts/2014-02-13-Relationships-Are-About-Community.md | cameroneshgh/cameroneshgh.github.io | da8be1ac917adb45c8daa0fdace83a08b19774b5 | [
"MIT"
] | null | null | null | _posts/2014-02-13-Relationships-Are-About-Community.md | cameroneshgh/cameroneshgh.github.io | da8be1ac917adb45c8daa0fdace83a08b19774b5 | [
"MIT"
] | null | null | null | ---
layout: post
title: Relationships Are About Community
description: ''
date: '2014-02-13T16:57:05.000Z'
categories: []
keywords: []
slug: /relationships-are-about-community
---
![](../images/cookout-outdoors.jpg)
Sorry I’ve been incommunicado, friends. Life (as a whole) has been rather hectic and busy as of late. It seems this blog and my quiet times are the first things to go, and that’s a shame. But one thing that has kept coming up for me in the past couple weeks is community and its purpose. And in my churning, I’ve come to the conclusion that community is about genuine relationships and relationships are about community. But why the circular logic?<!--more-->
#### why relationships are about community
The first relationship was the Father, Son, and Holy Spirit in perfect community. Relationships are about community because of this first relationship. Because we are made in the image of God, our relationships mirror God’s relationship with Himself. Or, they’re meant to.
#### how relationships are about community
The essence of relationships is knowing the other intimately and being known similarly. It’s about disarming ourselves of the weapons of false pretenses, busyness, and facades we so often resort to. Relationships are about making each other feel loved to the point where hiding is no longer seen as necessary. Relationships are where love takes root and people connect. And what we really want is to connect, right? Allow me to qualify.
#### relationships are about community, but community is about relationships
It pains me to use such circular logic, but it does get to the point. The point of relationships is that there are no ulterior motives or schemes. A relationship is meant to be a deep connection between people. God’s relationship with Himself is a perfect form of connectedness. And it is for the purpose of the intimate connection itself, no other reason, that God engages in such deep relationships. There is value in connectedness, we need it, and so we need relationships for what they are by definition. We were made to be connected. We were made for deep, intimate, soul-baring connection. But most of us settle for activity partners.
What kind of relationships do you do you have in your life, friend? As it was always intended, relationships are about community, about connectedness. Would you classify your relationships as such? Would you say your relationships bear the image of our trinitarian God? If I’m honest with you, most of mine don’t. And I know I’m not alone. | 94 | 640 | 0.787234 | eng_Latn | 0.999912 |
3bb07b326d2389edf504ff5171d8a252ad492e63 | 35 | md | Markdown | README.md | Ingebrigtsen/Devices | 0076d481f5f1f1bdcecfa37b97686e8ee8cee524 | [
"MIT"
] | null | null | null | README.md | Ingebrigtsen/Devices | 0076d481f5f1f1bdcecfa37b97686e8ee8cee524 | [
"MIT"
] | null | null | null | README.md | Ingebrigtsen/Devices | 0076d481f5f1f1bdcecfa37b97686e8ee8cee524 | [
"MIT"
] | null | null | null | # Devices
The devices in our house
| 11.666667 | 24 | 0.771429 | eng_Latn | 0.999001 |
3bb080d23a6bef4828ede538cda5fcb7ed7a45c9 | 1,340 | md | Markdown | AlchemyInsights/microsoft-search-in-bing-and-office-365-proplus.md | isabella232/OfficeDocs-AlchemyInsights-pr.ja-JP | 0233fcd9db195fa85ad6a3c82c69c7ac57d883c3 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-19T19:06:59.000Z | 2020-09-17T11:26:04.000Z | AlchemyInsights/microsoft-search-in-bing-and-office-365-proplus.md | isabella232/OfficeDocs-AlchemyInsights-pr.ja-JP | 0233fcd9db195fa85ad6a3c82c69c7ac57d883c3 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-02-09T06:55:26.000Z | 2022-02-09T06:56:03.000Z | AlchemyInsights/microsoft-search-in-bing-and-office-365-proplus.md | isabella232/OfficeDocs-AlchemyInsights-pr.ja-JP | 0233fcd9db195fa85ad6a3c82c69c7ac57d883c3 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-10-11T19:20:29.000Z | 2019-10-11T19:20:29.000Z | ---
title: Bing および Microsoft 365 Apps for enterprise での Microsoft Search
ms.author: pebaum
author: pebaum
ms.audience: ITPro
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Priority
ms.openlocfilehash: 10174582fca204d0fa44af23feba0f9412e99890
ms.sourcegitcommit: c6692ce0fa1358ec3529e59ca0ecdfdea4cdc759
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 09/14/2020
ms.locfileid: "47755064"
---
# <a name="microsoft-search-in-bing-and-microsoft-365-apps-for-enterprise"></a>Bing および Microsoft 365 Apps for enterprise での Microsoft Search
Google Chrome 向け Bing の拡張機能の***オプション***である Microsoft Search を、バージョン 2005 以降の Microsoft 365 Apps for enterprise (以前の Office 365 ProPlus) で使用することができます。 この拡張機能は、Windows を実行しているドメイン参加済みデバイスの特定の場所で利用することができます。 この拡張機能を展開すると、Bing が既定の検索エンジンとして設定されます。
拡張機能をインストールするには、Microsoft 365 管理センターの Microsoft Search のセクションで設定を構成して***オプトイン***する必要があります。
この***オプション***の拡張機能の展開にも、バージョン 2005 以降の Microsoft 365 Apps for enterprise に含まれているバックグラウンド サービスが必要です。 バックグラウンド サービスがインストールされていないと、拡張機能はインストールされません。
拡張機能をインストールした後でも、ユーザーはオン/オフの切り替えをするだけで、既定の検索エンジンとしての Bing の使用を簡単に停止することができます。
この ***オプション*** の拡張機能を展開する方法の詳細については、「[Bing および Microsoft 365 Apps for enterprise での Microsoft Search](https://docs.microsoft.com/deployoffice/microsoft-search-bing)」を参照してください。 | 49.62963 | 243 | 0.830597 | yue_Hant | 0.68098 |
3bb0e482a4c3a22f0446d7202c0915113cbb2299 | 3,619 | md | Markdown | README.md | ino5/jsp-team-project-in-choongang | ed94cea629a14d9f314828ff4a1a77e62919feef | [
"MIT"
] | null | null | null | README.md | ino5/jsp-team-project-in-choongang | ed94cea629a14d9f314828ff4a1a77e62919feef | [
"MIT"
] | null | null | null | README.md | ino5/jsp-team-project-in-choongang | ed94cea629a14d9f314828ff4a1a77e62919feef | [
"MIT"
] | null | null | null | # jsp-team-project-in-choongang
- 채용 공고 사이트 제작 웹 프로젝트
- 프레임워크를 활용하지 않고 java servlet과 jsp, jdbc(oracle)를 이용하여 제작하였습니다.
- 프로젝트 기간: 2021-05-17 ~ 2021-06-23
- 팀원: 안민현, 김민욱, 백인호, 임다솔, 정민희
<br>
## 기능 소개 및 ERD
![슬라이드40](https://user-images.githubusercontent.com/70236767/136733736-9837ce6c-7933-45fa-932c-6bc7d3f229aa.JPG)
![슬라이드10](https://user-images.githubusercontent.com/70236767/136733575-7af7e311-b27a-494e-b48c-fcec8f368490.JPG)
<br>
## 화면 구성
![슬라이드12](https://user-images.githubusercontent.com/70236767/136733816-b760085d-6027-4710-a5a2-0da02037e489.JPG)
![슬라이드13](https://user-images.githubusercontent.com/70236767/136733817-d4cbfdfb-c283-459e-8226-93fcf7a00644.JPG)
![슬라이드14](https://user-images.githubusercontent.com/70236767/136733819-bcd77f60-6b8f-4f61-9b69-edf7fba2a0e6.JPG)
![슬라이드15](https://user-images.githubusercontent.com/70236767/136733820-49a9b400-0e28-49fa-8b0a-b0758b9f8276.JPG)
![슬라이드16](https://user-images.githubusercontent.com/70236767/136733821-50295015-fd60-4cd5-ad9e-ada8d3dba51d.JPG)
![슬라이드17](https://user-images.githubusercontent.com/70236767/136733823-ad436f8c-3a30-4d5c-9d16-930ce3c9aba8.JPG)
![슬라이드18](https://user-images.githubusercontent.com/70236767/136733825-8abcdf8c-bb1a-42c2-807a-783f75a4cac8.JPG)
![슬라이드19](https://user-images.githubusercontent.com/70236767/136733827-022b292d-e18b-4f37-ab42-2af7c2d5fe35.JPG)
![슬라이드20](https://user-images.githubusercontent.com/70236767/136733828-79f3297d-3d7c-4a96-94c2-f75d097caa28.JPG)
![슬라이드21](https://user-images.githubusercontent.com/70236767/136733829-2409dcc2-d860-48d5-9df9-9c99d15544aa.JPG)
![슬라이드22](https://user-images.githubusercontent.com/70236767/136733830-86cf9f2c-5ead-4347-9927-cbdd9de782c8.JPG)
![슬라이드23](https://user-images.githubusercontent.com/70236767/136733831-5d51f866-4110-43b8-991c-033e1a009e5b.JPG)
![슬라이드24](https://user-images.githubusercontent.com/70236767/136733832-b9697669-e5b6-44e5-a498-f04de1885fe3.JPG)
![슬라이드25](https://user-images.githubusercontent.com/70236767/136733834-1cb83943-3a7a-46f1-a48b-f5b7ed64635f.JPG)
![슬라이드26](https://user-images.githubusercontent.com/70236767/136733836-8e3047af-fb76-45dc-9537-38de346525a6.JPG)
![슬라이드27](https://user-images.githubusercontent.com/70236767/136733838-11fd84ab-61b8-4db7-ad0a-de9746a115bd.JPG)
![슬라이드28](https://user-images.githubusercontent.com/70236767/136733839-79ec923a-da0a-4195-9ad8-dcaf9d521cc2.JPG)
![슬라이드29](https://user-images.githubusercontent.com/70236767/136733840-5f92716b-85d7-4073-a8ab-0d167a9abc6a.JPG)
![슬라이드30](https://user-images.githubusercontent.com/70236767/136733841-e854334c-066d-47f7-9888-19f4609feb76.JPG)
![슬라이드31](https://user-images.githubusercontent.com/70236767/136733842-6812bc2c-8c5b-464a-9427-84f13a771107.JPG)
![슬라이드32](https://user-images.githubusercontent.com/70236767/136733843-913c07a3-1030-4032-a319-bc3633b86245.JPG)
![슬라이드33](https://user-images.githubusercontent.com/70236767/136733844-474840c1-f102-4ce2-a195-0d3cbae39bb6.JPG)
![슬라이드34](https://user-images.githubusercontent.com/70236767/136733846-e48b3440-8cab-47bc-8f82-4d54b2c036ae.JPG)
![슬라이드35](https://user-images.githubusercontent.com/70236767/136733847-824e6238-299a-497b-8459-4472091cc917.JPG)
![슬라이드36](https://user-images.githubusercontent.com/70236767/136733849-4adc9f51-600c-4fea-9558-6486d9ddaf11.JPG)
![슬라이드37](https://user-images.githubusercontent.com/70236767/136733850-56fe557a-0cb1-4eab-86c5-9a4bb3584699.JPG)
![슬라이드38](https://user-images.githubusercontent.com/70236767/136733852-bd7de635-ee13-4870-832c-677330bc0ed5.JPG)
![슬라이드39](https://user-images.githubusercontent.com/70236767/136733854-1b96fdfd-f3fe-46d8-8725-4a70b4024186.JPG)
| 70.960784 | 112 | 0.810445 | yue_Hant | 0.250179 |
3bb133fffe4cd3fe1f2d143936f1d0ac5248ba4c | 512 | md | Markdown | catalog/hug-kiss-akushu/en-US_hug-kiss-akushu.md | htron-dev/baka-db | cb6e907a5c53113275da271631698cd3b35c9589 | [
"MIT"
] | 3 | 2021-08-12T20:02:29.000Z | 2021-09-05T05:03:32.000Z | catalog/hug-kiss-akushu/en-US_hug-kiss-akushu.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 8 | 2021-07-20T00:44:48.000Z | 2021-09-22T18:44:04.000Z | catalog/hug-kiss-akushu/en-US_hug-kiss-akushu.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 2 | 2021-07-19T01:38:25.000Z | 2021-07-29T08:10:29.000Z | # Hug Kiss Akushu
![hug-kiss-akushu](https://cdn.myanimelist.net/images/manga/3/182473.jpg)
- **type**: manga
- **volumes**: 1
- **chapters**: 6
- **original-name**: HUG キス あくしゅ
- **start-date**: 2009-09-20
## Tags
- school
- yaoi
## Authors
- Moto
- Haruhira (Story & Art)
## Sinopse
The feeling of love is difficult. That’s why I’ll try to be together with you. A teacher? A student? Their feelings?
## Links
- [My Anime list](https://myanimelist.net/manga/18413/Hug_Kiss_Akushu)
| 18.285714 | 116 | 0.644531 | eng_Latn | 0.522128 |
3bb144bf0aaa956d42ea45d33f699405602ffb38 | 691 | md | Markdown | number-pairs/README.md | redbearin/skill-work | 75d6b36893206bbd8f5a1a55bcb0418cfbd1a39d | [
"MIT"
] | 1 | 2020-01-15T02:13:13.000Z | 2020-01-15T02:13:13.000Z | number-pairs/README.md | redbearin/skill-work | 75d6b36893206bbd8f5a1a55bcb0418cfbd1a39d | [
"MIT"
] | null | null | null | number-pairs/README.md | redbearin/skill-work | 75d6b36893206bbd8f5a1a55bcb0418cfbd1a39d | [
"MIT"
] | null | null | null | Create a function that determines how many number pairs are embedded in a space-separated string. The first numeric value in the space-separated string represents the count of the numbers, thus, excluded in the pairings.
Examples
number_pairs("7 1 2 1 2 1 3 2") ➞ 2
// (1, 1), (2, 2)
number_pairs("9 10 20 20 10 10 30 50 10 20") ➞ 3
// (10, 10), (20, 20), (10, 10)
number_pairs("4 2 3 4 1") ➞ 0
// Although two 4's are present, the first one is discounted.
Notes
Always take into consideration the first number in the string is not part of the pairing, thus, the count. It may not seem so useful as most people see it, but it's mathematically significant if you deal with set operations. | 49.357143 | 224 | 0.732272 | eng_Latn | 0.999837 |
3bb1960d65bfa12e0b630a91a9dafcaadc9d0bee | 8,974 | md | Markdown | fabric/8123-8435/8353.md | hyperledger-gerrit-archive/fabric-gerrit | 188c6e69ccb2e4c4d609ae749a467fa7e289b262 | [
"Apache-2.0"
] | 2 | 2021-01-08T04:06:04.000Z | 2021-02-09T08:28:54.000Z | fabric/8123-8435/8353.md | cendhu/fabric-gerrit | 188c6e69ccb2e4c4d609ae749a467fa7e289b262 | [
"Apache-2.0"
] | null | null | null | fabric/8123-8435/8353.md | cendhu/fabric-gerrit | 188c6e69ccb2e4c4d609ae749a467fa7e289b262 | [
"Apache-2.0"
] | 4 | 2019-12-07T05:54:26.000Z | 2020-06-04T02:29:43.000Z | <strong>Project</strong>: fabric<br><strong>Branch</strong>: master<br><strong>ID</strong>: 8353<br><strong>Subject</strong>: [FAB-3155] LSCC security checks at validation time<br><strong>Status</strong>: ABANDONED<br><strong>Owner</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Assignee</strong>:<br><strong>Created</strong>: 4/21/2017, 10:28:27 AM<br><strong>LastUpdated</strong>: 4/23/2017, 10:58:57 AM<br><strong>CommitMessage</strong>:<br><pre>[FAB-3155] LSCC security checks at validation time
Fix bad nil check.
Change-Id: I62fbeba1a212787602e8d8f0a0502382da4118d0
Signed-off-by: Alessandro Sorniotti <ale.linux@sopit.net>
</pre><h1>Comments</h1><strong>Reviewer</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Reviewed</strong>: 4/21/2017, 10:28:27 AM<br><strong>Message</strong>: <pre>Uploaded patch set 1.</pre><strong>Reviewer</strong>: Elli Androulaki - lli@zurich.ibm.com<br><strong>Reviewed</strong>: 4/21/2017, 12:00:00 PM<br><strong>Message</strong>: <pre>Patch Set 1: Code-Review+1</pre><strong>Reviewer</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Reviewed</strong>: 4/21/2017, 12:11:45 PM<br><strong>Message</strong>: <pre>Patch Set 1: Code-Review+2</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/21/2017, 12:24:12 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4160/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/21/2017, 12:34:12 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/10093/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/21/2017, 12:34:15 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1627/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/21/2017, 1:41:07 PM<br><strong>Message</strong>: <pre>Patch Set 1: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4160/ : FAILURE (skipped)
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1627/ : FAILURE (skipped)
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/10093/ : SUCCESS</pre><strong>Reviewer</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Reviewed</strong>: 4/21/2017, 6:30:40 PM<br><strong>Message</strong>: <pre>Patch Set 1: Code-Review+2</pre><strong>Reviewer</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Reviewed</strong>: 4/23/2017, 3:18:15 AM<br><strong>Message</strong>: <pre>Uploaded patch set 2: Patch Set 1 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:22:47 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4320/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:22:48 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/10254/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:22:50 AM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1787/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:32:57 AM<br><strong>Message</strong>: <pre>Patch Set 2: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4320/ : FAILURE (skipped)
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/10254/ : FAILURE
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1787/ : FAILURE (skipped)</pre><strong>Reviewer</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Reviewed</strong>: 4/23/2017, 3:34:49 AM<br><strong>Message</strong>: <pre>Uploaded patch set 3: Patch Set 2 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:38:53 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/10259/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:38:53 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4325/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 3:39:11 AM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1792/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 4/23/2017, 4:53:07 AM<br><strong>Message</strong>: <pre>Patch Set 3: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4325/ : FAILURE (skipped)
https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1792/ : FAILURE (skipped)
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/10259/ : SUCCESS</pre><strong>Reviewer</strong>: Elli Androulaki - lli@zurich.ibm.com<br><strong>Reviewed</strong>: 4/23/2017, 9:43:58 AM<br><strong>Message</strong>: <pre>Patch Set 3: Code-Review+1</pre><strong>Reviewer</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Reviewed</strong>: 4/23/2017, 10:58:57 AM<br><strong>Message</strong>: <pre>Abandoned</pre><h1>PatchSets</h1><h3>PatchSet Number: 1</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Uploader</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Created</strong>: 4/21/2017, 10:28:27 AM<br><strong>UnmergedRevision</strong>: [8c0492c45c1b58fab281c7602982aa2965ed1a8f](https://github.com/hyperledger-gerrit-archive/fabric/commit/8c0492c45c1b58fab281c7602982aa2965ed1a8f)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 4/21/2017, 1:41:07 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Approved</strong>: 4/21/2017, 6:30:40 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Approved</strong>: 4/21/2017, 12:11:45 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Elli Androulaki - lli@zurich.ibm.com<br><strong>Approved</strong>: 4/21/2017, 12:00:00 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 2</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Uploader</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Created</strong>: 4/23/2017, 3:18:15 
AM<br><strong>UnmergedRevision</strong>: [861bd74155cb4396a5ba5d96e10e7960b70ff8ea](https://github.com/hyperledger-gerrit-archive/fabric/commit/861bd74155cb4396a5ba5d96e10e7960b70ff8ea)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 4/23/2017, 3:32:57 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 3</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Uploader</strong>: Alessandro Sorniotti - ale.linux@sopit.net<br><strong>Created</strong>: 4/23/2017, 3:34:49 AM<br><strong>UnmergedRevision</strong>: [5adf62b825d42bf5892ec2149f5510a529e6289b](https://github.com/hyperledger-gerrit-archive/fabric/commit/5adf62b825d42bf5892ec2149f5510a529e6289b)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 4/23/2017, 4:53:07 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Elli Androulaki - lli@zurich.ibm.com<br><strong>Approved</strong>: 4/23/2017, 9:43:58 AM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br></blockquote> | 183.142857 | 3,262 | 0.759639 | kor_Hang | 0.300908 |
3bb1ef204bbc046d83e04cbe10b14ca2740f8824 | 3,037 | md | Markdown | ANSWERS.md | tolokoban/sliding-puzzle-slider | ac48b9fbb053026e78c52fd5601ff75f2a7c085e | [
"MIT"
] | null | null | null | ANSWERS.md | tolokoban/sliding-puzzle-slider | ac48b9fbb053026e78c52fd5601ff75f2a7c085e | [
"MIT"
] | 1 | 2021-05-11T04:24:08.000Z | 2021-05-11T04:24:08.000Z | ANSWERS.md | tolokoban/sliding-puzzle-slider | ac48b9fbb053026e78c52fd5601ff75f2a7c085e | [
"MIT"
] | null | null | null | # Answers
## Adding DIVs
```js
function createCells() {
const grid = document.getElementById('grid')
const cells = []
for (let i = 0; i < 15; i++) {
const cell = document.createElement('div')
cell.textContent = i
grid.appendChild(cell)
cells.push(cell)
}
return cells
}
```
## Styling the grid
```css
#grid {
position: relative;
width: 128px;
height: 128px;
margin: 32px;
padding: 0;
background: #bbb;
}
#grid > div {
position: absolute;
background: #ddd;
width: 32px;
height: 32px;
margin: 0;
display: flex;
justify-content: center;
align-items: center;
border-radius: 4px;
box-shadow: 2px 2px 1px #fff inset, -1px -1px 1px #000 inset;
}
```
## Moving cells
```js
function move(cell, row, col) {
cell.style.transform = `translate(${col * 32}px,${row * 32}px)`
}
```
## Setting up
```js
function setUp(cells) {
cells.forEach((cell, k) => {
const col = k % 4;
const row = Math.floor((k - col) / 4)
move(cell, row, col)
})
}
```
## Random positions
```js
function setUp(cells) {
const mapping = shuffle([0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15])
cells.forEach((cell, idx) => {
const k = mapping[idx]
const col = k % 4;
const row = Math.floor((k - col) / 4)
move(cell, row, col)
})
}
function shuffle(arr) {
  // Fisher-Yates: swap each position with a random index at or before it.
  // This yields an unbiased shuffle, unlike swapping with any random index.
  for (let i = arr.length - 1; i > 0; i--) {
    const k = Math.floor(Math.random() * (i + 1))
    const tmp = arr[i]
    arr[i] = arr[k]
    arr[k] = tmp
  }
  return arr
}
```
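One caveat about the random setup above: for a 15-puzzle, half of all tile arrangements cannot be reached by sliding moves, so a shuffled board may be unsolvable. A sketch of the standard solvability test follows (assumptions: `tiles` is a flat array of the 16 values in reading order, cells are numbered 0 to 14 as in this tutorial, and 15 stands for the empty square, which belongs in the last position):

```javascript
function isSolvable(tiles) {
  // Count inversions among the real tiles (pairs that appear out of order).
  const values = tiles.filter(t => t !== 15)
  let inversions = 0
  for (let i = 0; i < values.length; i++) {
    for (let j = i + 1; j < values.length; j++) {
      if (values[i] > values[j]) inversions++
    }
  }
  // Row of the empty square, counted from the bottom starting at 1.
  const blankRowFromBottom = 4 - Math.floor(tiles.indexOf(15) / 4)
  // On a board of even width, an arrangement is solvable exactly when
  // inversions plus the blank's row from the bottom is odd.
  return (inversions + blankRowFromBottom) % 2 === 1
}
```

A shuffled `mapping` that fails this test can simply be reshuffled until one passes.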
## Advanced styling
```css
#grid {
border: 2px solid #864;
border-radius: 4px;
}
#grid::before {
z-index: -1;
content: ".";
font-size: 0;
position: absolute;
border-radius: 24px;
width: 176px;
height: 176px;
margin: -24px;
padding: 0;
background: linear-gradient(to top,#cba,#420);
box-shadow: 0 0 1px 3px inset #8647, 0 8px 16px #000b, 0 0px 8px 4px #321e;
}
#grid::after {
z-index: 1;
content: ".";
font-size: 0;
position: absolute;
width: 376px;
margin: 0;
height: 176px;
transform: translate(-176px,-100px);
border-radius: 100%;
background: #fff6;
clip-path: path('M176,76 h128 q24,0,24,24 v100 h-176 v-100 q0,-24,24,-24 Z');
}
/* Flex */
body {
overflow: hidden;
background: #def;
}
div.flex {
margin: 0;
position: absolute;
width: 100%;
height: 100%;
display: flex;
flex-direction: column;
justify-content: space-around;
align-items: center;
transition: opacity .5s, filter 1s, transform .5s;
opacity: 1;
filter: blur(0);
transform: scale(1);
}
div.flex.hide {
opacity: 0;
filter: blur(10px);
transform: scale(.75);
}
```
## Promise
```js
function loadImage(url) {
return new Promise((resolve, reject) => {
const img = new Image()
img.src = url
img.onload = () => resolve(img)
img.onerror = reject
})
}
```
```js
const logoLoader = loadImage(LOGO)
const gridLoader = loadImage(LOGO)
Promise.all([logoLoader, gridLoader]).then(start)
```
## Solving the game
You can use the A* algorithm with the Manhattan distance as a heuristic.
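A sketch of the heuristic itself (not the full A* search). It assumes a flat encoding of the board: an array of the 16 values in reading order, tiles numbered 0 to 14 with 15 standing for the empty square, and the goal being ascending order; that encoding is an assumption for illustration, not something the tutorial prescribes:

```javascript
function manhattan(tiles) {
  // Sum, over every real tile, of how many rows plus columns it sits away
  // from its goal cell. It never overestimates, so A* remains optimal.
  let total = 0
  for (let idx = 0; idx < tiles.length; idx++) {
    const tile = tiles[idx]
    if (tile === 15) continue // the empty square is not counted
    // Tile t belongs at index t, so compare current and goal coordinates.
    total += Math.abs(Math.floor(idx / 4) - Math.floor(tile / 4))
           + Math.abs((idx % 4) - (tile % 4))
  }
  return total
}
```

A* then expands board states in order of moves-so-far plus `manhattan(state)`.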
| 17.158192 | 79 | 0.614422 | eng_Latn | 0.326321 |
3bb2286529931b01390f97d0287fcc396caa412a | 98 | md | Markdown | README.md | AnyMoeNFT/website | 7167ebf9ff81293d46c444f0023cb814eeadcfd0 | [
"Apache-2.0"
] | null | null | null | README.md | AnyMoeNFT/website | 7167ebf9ff81293d46c444f0023cb814eeadcfd0 | [
"Apache-2.0"
] | null | null | null | README.md | AnyMoeNFT/website | 7167ebf9ff81293d46c444f0023cb814eeadcfd0 | [
"Apache-2.0"
] | null | null | null |

# AnyMoe Website
Here's the source code of our website, built with Vite.
## License
Apache 2.0
| 12.25 | 55 | 0.72449 | eng_Latn | 0.930756 |
3bb3626b1e2b3098568d0274fca50d0cd17d19e3 | 503 | md | Markdown | _posts/2017-01-29-Vera-Wang-Olympia.md | HOLEIN/HOLEIN.github.io | 7da00c82f070f731cb05c3799426f481aba19b99 | [
"MIT"
] | null | null | null | _posts/2017-01-29-Vera-Wang-Olympia.md | HOLEIN/HOLEIN.github.io | 7da00c82f070f731cb05c3799426f481aba19b99 | [
"MIT"
] | null | null | null | _posts/2017-01-29-Vera-Wang-Olympia.md | HOLEIN/HOLEIN.github.io | 7da00c82f070f731cb05c3799426f481aba19b99 | [
"MIT"
] | null | null | null |

---
layout: post
date: 2017-01-29
title: "Vera Wang Olympia"
category: Vera Wang
tags: [Vera Wang]
---
### Vera Wang Olympia
Just **$329.99**
<a href="https://www.readybrides.com/en/vera-wang/4102-vera-wang-olympia.html"><img src="//img.readybrides.com/8180/vera-wang-olympia.jpg" alt="Vera Wang Olympia" style="width:100%;" /></a>
<!-- break -->
Buy it: [https://www.readybrides.com/en/vera-wang/4102-vera-wang-olympia.html](https://www.readybrides.com/en/vera-wang/4102-vera-wang-olympia.html)
| 31.4375 | 189 | 0.697813 | yue_Hant | 0.413149 |
3bb64f8a745f22198cee5009218be68391eeffee | 451 | md | Markdown | calc/readme.md | PiroDev/golang-utils | 333a39153eb7108d48cd7cfcb1a68583c8bea1d2 | [
"Apache-2.0"
] | null | null | null | calc/readme.md | PiroDev/golang-utils | 333a39153eb7108d48cd7cfcb1a68583c8bea1d2 | [
"Apache-2.0"
] | null | null | null | calc/readme.md | PiroDev/golang-utils | 333a39153eb7108d48cd7cfcb1a68583c8bea1d2 | [
"Apache-2.0"
] | null | null | null |

# Calc
Simple calculator utility. Supports simple operations (**+ - * /**).
## Requirements
> To use this utility you need to [install Go](https://golang.org/doc/install)
## Usage
`go run main.go "expression"`
An expression can contain:

1. Float numbers (unary operations such as unary minus or plus (-num, +num) are not supported)
2. Operations: **+ - * /**
3. Brackets: **( )**
Example expression:
```math
2 * ((3 + 5) * 10 - 35 / 23) * 33 + 12
``` | 20.5 | 89 | 0.643016 | eng_Latn | 0.849364 |
3bb723057748e1e3655ef79f2a67a7f9a7c85ce7 | 12,534 | md | Markdown | articles/notification-hubs/notification-hubs-kindle-amazon-adm-push-notification.md | OpenLocalizationTestOrg/azure-docs-pr15_hr-HR | 94470f6d3849fb1d48d443d49ffe0217ddba2f80 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/notification-hubs/notification-hubs-kindle-amazon-adm-push-notification.md | OpenLocalizationTestOrg/azure-docs-pr15_hr-HR | 94470f6d3849fb1d48d443d49ffe0217ddba2f80 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/notification-hubs/notification-hubs-kindle-amazon-adm-push-notification.md | OpenLocalizationTestOrg/azure-docs-pr15_hr-HR | 94470f6d3849fb1d48d443d49ffe0217ddba2f80 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null |

<properties
    pageTitle="Get started with Azure Notification Hubs for Kindle apps | Microsoft Azure"
    description="In this tutorial, you learn how to use Azure Notification Hubs to send push notifications to Kindle applications."
    services="notification-hubs"
    documentationCenter=""
    authors="ysxu"
    manager="erikre"
    editor=""/>

<tags
    ms.service="notification-hubs"
    ms.workload="mobile"
    ms.tgt_pltfrm="mobile-kindle"
    ms.devlang="Java"
    ms.topic="hero-article"
    ms.date="06/29/2016"
    ms.author="yuaxu"/>

# Get started with Notification Hubs for Kindle apps

[AZURE.INCLUDE [notification-hubs-selector-get-started](../../includes/notification-hubs-selector-get-started.md)]

##<a name="overview"></a>Overview

This tutorial shows you how to use Azure Notification Hubs to send push notifications to Kindle applications.
You will create a blank Kindle app that receives push notifications by using Amazon Device Messaging (ADM).

##<a name="prerequisites"></a>Prerequisites

This tutorial requires the following:

+ The Android SDK (we assume that you will use Eclipse) from the <a href="http://go.microsoft.com/fwlink/?LinkId=389797">Android site</a>.
+ Follow the steps in <a href="https://developer.amazon.com/appsandservices/resources/development-tools/ide-tools/tech-docs/01-setting-up-your-development-environment">Setting Up Your Development Environment</a> to set up your development environment for Kindle.

##<a name="add-a-new-app-to-the-developer-portal"></a>Add a new app to the developer portal

1. First, create an app in the [Amazon developer portal].

    ![][0]

2. Copy the **Application Key**.

    ![][1]

3. In the portal, click your app name, and then click the **Device Messaging** tab.

    ![][2]

4. Click **Create New Security Profile**, and then create a new security profile (for example, **TestAdm security profile**). Then click **Save**.

    ![][3]

5. Click **Security Profiles** to view the security profile that you just created. Copy the **Client ID** and **Client Secret** values for later use.

    ![][4]

## <a name="create-an-api-key"></a>Create an API key

1. Open a command prompt with administrator privileges.

2. Navigate to your Android SDK folder.

3. Enter the following command:

        keytool -list -v -alias androiddebugkey -keystore ./debug.keystore

    ![][5]

4. For the **keystore** password, type **android**.

5. Copy the **MD5** fingerprint.

6. Back in the developer portal, on the **Messaging** tab, click **Android/Kindle**, enter your app package name (for example, **com.sample.notificationhubtest**) and the **MD5** value, and then click **Generate API Key**.

## <a name="add-credentials-to-the-hub"></a>Add credentials to the hub

In the portal, add the client secret and the client ID on the **Configure** tab of your notification hub.

## <a name="set-up-your-application"></a>Set up your application

> [AZURE.NOTE] When you create the application, use API level 17 or higher.

Add the ADM libraries to your Eclipse project:

1. To get the ADM libraries, [download the SDK]. Extract the SDK zip file.
2. In Eclipse, right-click the project, and then click **Properties**. Select **Java Build Path** on the left, and then select the **Libraries** tab at the top. Click **Add External JARs**, and select the file `\SDK\Android\DeviceMessaging\lib\amazon-device-messaging-*.jar` from the directory extracted from the Amazon SDK.
3. Download the Notification Hubs Android SDK (link).
4. Unzip the package, and then drag the file `notification-hubs-sdk.jar` into the `libs` folder in Eclipse.

Edit your application manifest to support ADM:

1. Add the Amazon namespace to the root element of the manifest:

        xmlns:amazon="http://schemas.amazon.com/apk/res/android"

2. Add the permissions as the first element under the manifest element. Replace **[YOUR PACKAGE NAME]** with the package that you used to create the app.

        <permission
         android:name="[YOUR PACKAGE NAME].permission.RECEIVE_ADM_MESSAGE"
         android:protectionLevel="signature" />
        <uses-permission android:name="android.permission.INTERNET"/>
        <uses-permission android:name="[YOUR PACKAGE NAME].permission.RECEIVE_ADM_MESSAGE" />
        <!-- This permission allows your app access to receive push notifications
        from ADM. -->
        <uses-permission android:name="com.amazon.device.messaging.permission.RECEIVE" />
        <!-- ADM uses WAKE_LOCK to keep the processor from sleeping when a message is received. -->
        <uses-permission android:name="android.permission.WAKE_LOCK" />

3. Insert the following element as the first child of the application element. Remember to replace **[YOUR SERVICE NAME]** with the name of the ADM message handler that you create in the next section (including the package), and replace **[YOUR PACKAGE NAME]** with the package name that you used to create the app.

        <amazon:enable-feature
             android:name="com.amazon.device.messaging"
             android:required="true"/>
        <service
            android:name="[YOUR SERVICE NAME]"
            android:exported="false" />
        <!-- This permission ensures that only ADM can send your app registration broadcasts. -->
        <receiver
            android:name="[YOUR SERVICE NAME]$Receiver"
            android:permission="com.amazon.device.messaging.permission.SEND" >
            <!-- To interact with ADM, your app must listen for the following intents. -->
            <intent-filter>
                <action android:name="com.amazon.device.messaging.intent.REGISTRATION" />
                <action android:name="com.amazon.device.messaging.intent.RECEIVE" />
                <!-- Replace the name in the category tag with your app's package name. -->
                <category android:name="[YOUR PACKAGE NAME]" />
            </intent-filter>
        </receiver>

## <a name="create-your-adm-message-handler"></a>Create your ADM message handler

1. Create a new class that inherits from `com.amazon.device.messaging.ADMMessageHandlerBase` and name it `MyADMMessageHandler`, as shown in the following figure:

    ![][6]

2. Add the following `import` statements:

        import android.app.NotificationManager;
        import android.app.PendingIntent;
        import android.content.Context;
        import android.content.Intent;
        import android.support.v4.app.NotificationCompat;
        import com.amazon.device.messaging.ADMMessageReceiver;
        import com.microsoft.windowsazure.messaging.NotificationHub;

3. Add the following code to the class that you created. Note that you must substitute your hub name and your (listen) connection string:

        public static final int NOTIFICATION_ID = 1;
        private NotificationManager mNotificationManager;
        NotificationCompat.Builder builder;
        private static NotificationHub hub;

        public static NotificationHub getNotificationHub(Context context) {
            Log.v("com.wa.hellokindlefire", "getNotificationHub");
            if (hub == null) {
                hub = new NotificationHub("[hub name]", "[listen connection string]", context);
            }
            return hub;
        }

        public MyADMMessageHandler() {
            super("MyADMMessageHandler");
        }

        public static class Receiver extends ADMMessageReceiver
        {
            public Receiver()
            {
                super(MyADMMessageHandler.class);
            }
        }

        private void sendNotification(String msg) {
            Context ctx = getApplicationContext();
            mNotificationManager = (NotificationManager)
                    ctx.getSystemService(Context.NOTIFICATION_SERVICE);

            PendingIntent contentIntent = PendingIntent.getActivity(ctx, 0,
                    new Intent(ctx, MainActivity.class), 0);

            NotificationCompat.Builder mBuilder =
                    new NotificationCompat.Builder(ctx)
                            .setSmallIcon(R.mipmap.ic_launcher)
                            .setContentTitle("Notification Hub Demo")
                            .setStyle(new NotificationCompat.BigTextStyle()
                                    .bigText(msg))
                            .setContentText(msg);

            mBuilder.setContentIntent(contentIntent);
            mNotificationManager.notify(NOTIFICATION_ID, mBuilder.build());
        }

4. Add the following code to the `OnMessage()` method:

        String nhMessage = intent.getExtras().getString("msg");
        sendNotification(nhMessage);

5. Add the following code to the `OnRegistered` method:

        try {
            getNotificationHub(getApplicationContext()).register(registrationId);
        } catch (Exception e) {
            Log.e("[your package name]", "Fail onRegister: " + e.getMessage(), e);
        }

6. Add the following code to the `OnUnregistered` method:

        try {
            getNotificationHub(getApplicationContext()).unregister();
        } catch (Exception e) {
            Log.e("[your package name]", "Fail onUnregister: " + e.getMessage(), e);
        }

7. In `MainActivity`, add the following import statement:

        import com.amazon.device.messaging.ADM;

8. Add the following code at the end of the `OnCreate` method:

        final ADM adm = new ADM(this);
        if (adm.getRegistrationId() == null)
        {
            adm.startRegister();
        } else {
            new AsyncTask() {
                @Override
                protected Object doInBackground(Object... params) {
                    try { MyADMMessageHandler.getNotificationHub(getApplicationContext()).register(adm.getRegistrationId());
                    } catch (Exception e) {
                        Log.e("com.wa.hellokindlefire", "Failed registration with hub", e);
                        return e;
                    }
                    return null;
                }
            }.execute(null, null, null);
        }

## <a name="add-your-api-key-to-your-app"></a>Add your API key to your app

1. In Eclipse, create a new file named **api_key.txt** in the assets directory of the project.

2. Open the file and copy into it the API key that you generated in the Amazon developer portal.

## <a name="run-the-app"></a>Run the app

1. Start the emulator.

2. In the emulator, swipe down from the top and click **Settings**, then click **My Account**, and register a valid Amazon account.

3. In Eclipse, run the app.

> [AZURE.NOTE] If a problem occurs, check the time on the emulator (or device). The time value must be accurate. To change the time on the Kindle emulator, you can run the following command from the platform-tools directory of the Android SDK:

    adb shell date -s "yyyymmdd.hhmmss"

## <a name="send-a-message"></a>Send a message

To send a message by using .NET:

    static void Main(string[] args)
    {
        NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString("[conn string]", "[hub name]");
        hub.SendAdmNativeNotificationAsync("{\"data\":{\"msg\" : \"Hello from .NET!\"}}").Wait();
    }

![][7]

<!-- URLs. -->
[Amazon developer portal]: https://developer.amazon.com/home.html
[download the SDK]: https://developer.amazon.com/public/resources/development-tools/sdk
[0]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-portal1.png
[1]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-portal2.png
[2]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-portal3.png
[3]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-portal4.png
[4]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-portal5.png
[5]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-cmd-window.png
[6]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-new-java-class.png
[7]: ./media/notification-hubs-kindle-get-started/notification-hub-kindle-notification.png
| 44.133803 | 336 | 0.686772 | hrv_Latn | 0.622925 |
3bb8bf34c43a0b841ebf613554933a85bd0d2d4b | 4,661 | md | Markdown | articles/app-service-mobile/app-service-mobile-how-to-configure-microsoft-authentication.md | vipulverulkar/Azure-contecnt | 2c7bc80617d1af0bdb9c3a3f264b6b707d722d0c | [
"CC-BY-3.0"
] | 5 | 2019-12-09T16:03:23.000Z | 2020-02-22T23:57:12.000Z | articles/app-service-mobile/app-service-mobile-how-to-configure-microsoft-authentication.md | vipulverulkar/Azure-contecnt | 2c7bc80617d1af0bdb9c3a3f264b6b707d722d0c | [
"CC-BY-3.0"
] | null | null | null | articles/app-service-mobile/app-service-mobile-how-to-configure-microsoft-authentication.md | vipulverulkar/Azure-contecnt | 2c7bc80617d1af0bdb9c3a3f264b6b707d722d0c | [
"CC-BY-3.0"
] | null | null | null | <properties
pageTitle="How to configure Microsoft Account authentication for your App Services application"
description="Learn how to configure Microsoft Account authentication for your App Services application."
authors="mattchenderson"
services="app-service\mobile"
documentationCenter=""
manager="dwrede"
editor=""/>
<tags
ms.service="app-service-mobile"
ms.workload="mobile"
ms.tgt_pltfrm="na"
ms.devlang="multiple"
ms.topic="article"
ms.date="11/20/2015"
ms.author="mahender"/>
# How to configure your App Service application to use Microsoft Account login
[AZURE.INCLUDE [app-service-mobile-selector-authentication](../../includes/app-service-mobile-selector-authentication.md)]
[AZURE.INCLUDE [app-service-mobile-note-mobile-services](../../includes/app-service-mobile-note-mobile-services.md)]
This topic shows you how to configure Azure App Service to use Microsoft Account as an authentication provider.
> [AZURE.NOTE]
This topic demonstrates use of the App Service Authentication / Authorization feature. This replaces the App Service gateway for most applications. Differences that apply to using the gateway are called out in notes throughout the topic.
## <a name="register"> </a>Register your application with Microsoft Account
1. Log on to the [Azure portal], and navigate to your application. Copy your **URL**. You will use this to configure your Microsoft Account app.
2. Navigate to the [My Applications] page in the Microsoft Account Developer Center, and log on with your Microsoft account, if required.
4. Click **Create application**, then type an **Application name** and click **I accept**.
5. Click **API Settings**. Select **Yes** for **Mobile or desktop client app**. In the **Redirect URL** field, enter your application's **Redirect URL** and click **Save**. Your redirect URI is the URL of your application appended with the path, _/.auth/login/microsoftaccount/callback_. For example, `https://contoso.azurewebsites.net/.auth/login/microsoftaccount/callback`. Make sure that you are using the HTTPS scheme.
![][0]
> [AZURE.NOTE]
If you are using the App Service Gateway instead of the App Service Authentication / Authorization feature, your redirect URL instead uses the gateway URL with the _/signin-microsoft_ path.
6. Click **App Settings** and make a note of the values of the **Client ID** and **Client secret**.
> [AZURE.NOTE] The client secret is an important security credential. Do not share the client secret with anyone or distribute it within a client application.
## <a name="secrets"> </a>Add Microsoft Account information to your application
> [AZURE.NOTE]
If using the App Service Gateway, ignore this section and instead navigate to your gateway in the portal. Select **Settings**, **Identity**, and then **Microsoft Account**. Paste in the values you obtained earlier and click **Save**.
7. Back in the [Azure portal], navigate to your application. Click **Settings**, and then **Authentication / Authorization**.
8. If the Authentication / Authorization feature is not enabled, turn the switch to **On**.
9. Click **Microsoft Account**. Paste in the App ID and App Secret values which you obtained previously, and optionally enable any scopes your application requires. Then click **OK**.
![][1]
By default, App Service provides authentication but does not restrict authorized access to your site content and APIs. You must authorize users in your app code.
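As a hedged illustration of that last point (a minimal plain-JavaScript sketch, not tied to any particular framework): App Service authentication forwards the signed-in user's identity to your backend in request headers such as `X-MS-CLIENT-PRINCIPAL-NAME`, and your own code decides what that user may access.

```javascript
// Returns the signed-in user's name from a request's headers, or null for
// an anonymous request. Header names are case-insensitive in HTTP, so
// normalize them before the lookup.
function getAuthenticatedUser(headers) {
  const normalized = {}
  for (const key of Object.keys(headers)) {
    normalized[key.toLowerCase()] = headers[key]
  }
  return normalized['x-ms-client-principal-name'] || null
}
```

A request handler can then branch on `getAuthenticatedUser(req.headers)` being `null` to reject anonymous callers.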
10. (Optional) To restrict access to your site to only users authenticated by Microsoft account, set **Action to take when request is not authenticated** to **Microsoft Account**. This requires that all requests be authenticated, and all unauthenticated requests are redirected to Microsoft account for authentication.
11. Click **Save**.
You are now ready to use Microsoft Account for authentication in your app.
## <a name="related-content"> </a>Related Content
[AZURE.INCLUDE [app-service-mobile-related-content-get-started-users](../../includes/app-service-mobile-related-content-get-started-users.md)]
<!-- Authenticate your app with Live Connect Single Sign-On: [Windows](windows-liveconnect) -->
<!-- Images. -->
[0]: ./media/app-service-mobile-how-to-configure-microsoft-authentication/app-service-microsoftaccount-redirect.png
[1]: ./media/app-service-mobile-how-to-configure-microsoft-authentication/mobile-app-microsoftaccount-settings.png
<!-- URLs. -->
[My Applications]: http://go.microsoft.com/fwlink/p/?LinkId=262039
[Azure portal]: https://portal.azure.com/
| 48.552083 | 423 | 0.745977 | eng_Latn | 0.970894 |
3bba9e05c6930eb93f7bea4f27e520f9694b039c | 52 | md | Markdown | README.md | meez/air-ant-support | f612785e8ab7e50f20a98deda35e989f3f10e83a | [
"MIT"
] | null | null | null | README.md | meez/air-ant-support | f612785e8ab7e50f20a98deda35e989f3f10e83a | [
"MIT"
] | null | null | null | README.md | meez/air-ant-support | f612785e8ab7e50f20a98deda35e989f3f10e83a | [
"MIT"
] | null | null | null | # air-ant-support
ANT Scripts for building AIR Apps
| 17.333333 | 33 | 0.788462 | eng_Latn | 0.890795 |
3bbb9a48a9733c6cb0a54d5c91c63722b2603baa | 104 | md | Markdown | bylaws.md | NextGen-Coop/ngc-website | b809b4d8f002d13c430655dc0b2bac027984d937 | [
"Apache-2.0"
] | 1 | 2020-12-31T12:46:09.000Z | 2020-12-31T12:46:09.000Z | bylaws.md | NextGen-Coop/ngc-website | b809b4d8f002d13c430655dc0b2bac027984d937 | [
"Apache-2.0"
] | 2 | 2020-12-31T06:06:59.000Z | 2021-01-12T09:02:05.000Z | bylaws.md | NextGen-Coop/ngc-website | b809b4d8f002d13c430655dc0b2bac027984d937 | [
"Apache-2.0"
] | null | null | null | ---
layout: page
title: Bylaws
permalink: /bylaws/
menu: true
---
This page is under-construction.gif.
| 11.555556 | 36 | 0.711538 | eng_Latn | 0.976326 |
3bbcdec9125695bfe0295a754168eea2c8472af7 | 2,876 | md | Markdown | 2021/materials/taiwan/day1_activity_intro.md | meghaarora42/summer-institute | 396a93ee5e999c3be5c4c212953eb8ddfd8ae7cd | [
"MIT"
] | 264 | 2017-02-01T16:50:02.000Z | 2022-03-30T20:36:20.000Z | 2021/materials/taiwan/day1_activity_intro.md | meghaarora42/summer-institute | 396a93ee5e999c3be5c4c212953eb8ddfd8ae7cd | [
"MIT"
] | 103 | 2017-03-31T19:32:05.000Z | 2022-02-26T03:24:55.000Z | 2021/materials/taiwan/day1_activity_intro.md | meghaarora42/summer-institute | 396a93ee5e999c3be5c4c212953eb8ddfd8ae7cd | [
"MIT"
] | 199 | 2017-06-19T15:04:00.000Z | 2022-03-18T13:21:14.000Z | # Small group discussion guide
## Summer Institutes in Computational Social Science 2021
## Day 1, Small group discussions
## Prepared by Robin Lee
### Summary
In this activity, participants will split into small groups and discuss computational social science and SICSS.
### Learning objectives
- Participants will get a chance to get to know each other on the first day.
- Participants will develop their own views on computational social science as a field.
- Participants will get to recap their learnings regarding introduction materials.
### Participant preparation
There is no participant preparation before this activity.
### Activity
- We will split you into small groups of about 4 people.
- Introduce yourself to each other and discuss one topic from session 1 the group is most interested in (about 15 minutes)
- Full group discussion on session 1 (about 10 minutes).
- Discuss one topic from session 2 (about 15 minutes)
- Full group discussion on session 2 (about 10 minutes).
### Session 1 Topics
1. One of the videos you watched before participating featured Matt answering the question "What is computational social science?". But, that's just his view. What do you think is computational social science? What are the things that are most exciting about computational social science to you? Are there specific computational social science studies that you find particularly beautiful or inspiring?
2. How accessible do you think Computational Social Science is? What are potential ways for you to expand the audience? Please brainstorm at least 2 potential initiatives.
3. What are some examples of high quality training in either CSS or another interdisciplinary topic that you have participated in? What are the features of those training materials and sessions? Share with your small group members.
### Session 2 Topics
1. Using mobile phone call records as a data source, please come up with at least 5 potential research questions that it might serve as a useful alternative/complementary dataset. What topics might we answer using this type of data?
2. Let’s say you’re teaching a course for college students, how would you explain the research done by Blumenstock? Please be creative in using visual displays (for example slides, memes, 懶人包).
3. As explained in both the video and the textbook, doing gold-standard health and demographic surveys is expensive and infrequent. Please come up with 3 existing research-grade, custom-made datasets where cost is a problem and ready-made data gathered using computational methods might serve as a decent proxy.
### Notetaker/facilitator
In session 1, the person with the most strokes in their Chinese name will be the notetaker, while the person with the least strokes will facilitate the conversation.
In session 2, the persons who have not been notetakers or facilitators should take on the roles.
| 56.392157 | 405 | 0.796592 | eng_Latn | 0.999607 |
3bbe86bc00b7543962d472a2000eae14cce7be49 | 10,774 | md | Markdown | FluentPython.md | MachPro/BookNotes | 81eb70464c2990c9c0462046fff74ea4d7af4259 | [
"Apache-2.0"
] | null | null | null | FluentPython.md | MachPro/BookNotes | 81eb70464c2990c9c0462046fff74ea4d7af4259 | [
"Apache-2.0"
] | null | null | null | FluentPython.md | MachPro/BookNotes | 81eb70464c2990c9c0462046fff74ea4d7af4259 | [
"Apache-2.0"
] | null | null | null | # Fluent Python
## Python Data Model
The Python interpreter uses special methods to perform special operations, such as iteration, operator overloading, and object creation and destruction.
```python
# get an item by index
def __getitem__(self, position): ...
# create an object
def __init__(self): ...
# overload the + operator
def __add__(self, other): ...
```
#### String Representation
```python
# repr should match the source code needed to re-create the object being represented
def __repr__(self): ...
# str should return a human-readable string for the end user
def __str__(self): ...
```
**\_\_repr\_\_** will be called as a fallback if **\_\_str\_\_** is not implemented
#### Boolean Value of Object
```python
def __bool__(self): ...
def __len__(self): ...
```
If **\_\_bool\_\_** is not implemented, **\_\_len\_\_** will be called instead; if that returns zero, *bool()* returns *False*
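A minimal sketch tying these special methods together; implementing \_\_len\_\_ and \_\_getitem\_\_ is enough to make a class behave like a sequence (the deck class and card data here are illustrative):

```python
import collections

Card = collections.namedtuple('Card', ['rank', 'suit'])

class Deck:
    ranks = [str(n) for n in range(2, 11)] + list('JQKA')
    suits = 'spades diamonds clubs hearts'.split()

    def __init__(self):
        self._cards = [Card(rank, suit)
                       for suit in self.suits
                       for rank in self.ranks]

    def __len__(self):
        return len(self._cards)

    def __getitem__(self, position):
        return self._cards[position]

deck = Deck()
len(deck)   # 52, delegated to __len__
deck[0]     # Card(rank='2', suit='spades'), delegated to __getitem__
# __getitem__ also makes the deck iterable and sliceable for free
```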
## Sequences
#### Overview of built-in sequences
We can group all the built-in sequence types together based on two criteria.
The first criterion is complexity.
* Container Sequences (Complex sequences)
list, tuple, collections.deque
* Flat Sequences (Simple sequences)
str, bytes, bytearray, memoryview, array.array
The second criterion is mutability.
* Immutable sequences
tuple, str, bytes
* Mutable sequences
list, bytearray, array.array, collections.deque, memoryview
#### List Comprehension & Generator Expression
Listcomp: an expression that builds a new list
Generator expression: similar to a listcomp, but it yields items lazily and can be used to build any other kind of sequence
```python
symbol = 'abcde'
# list comp, enclosed in brackets
cs = [i for i in symbol]
# generator expr, enclosed in parentheses
tuple(i for i in symbol)
```
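Because a generator expression yields items one at a time, it can feed an aggregating function directly without building an intermediate list:

```python
symbol = 'abcde'
codes = [ord(c) for c in symbol]     # listcomp: the full list is built in memory
total = sum(ord(c) for c in symbol)  # genexp: sum() consumes one value at a time
# total == 495 == sum(codes)
```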
#### Tuple
##### Tuple Unpacking
Tuple unpacking can be used for parallel assignment and variable swapping.
```python
a, b = (1, 2)
b, a = a, b
```
Use a star (\*) in tuple unpacking
```python
def add(*args):
    total = 0
    for i in args:
        total += i
    return total
t = (1,2,3)
add(*t)
# use * to store excess items
a, *b = t
```
##### Named Tuple
*namedtuple* is a factory that produces subclasses of *tuple* with field names and a class name
```python
from collections import namedtuple
Board = namedtuple('Board', ['name', 'price'])
catan = Board(name='Catan', price=10)
data = ('Carcarssone', 10)
# instantiate a named tuple from iterable
Board._make(data)
# return ordered dict
catan._asdict()
```
#### Slice and Ellipsis
The notation a : b : c is only valid within [] when used as index or subscript, and it produces a slice object slice(a,b,c)
... is a shortcut for the Ellipsis object
```python
# if x is a four-dimensional array (e.g. a NumPy array)
# x[i, ...] is equal to x[i, :, :, :]
```
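Since a : b : c produces an ordinary slice object, slices can be given names, which makes fixed-width record parsing readable (the field widths below are made up for illustration):

```python
record = 'Catan       10'
NAME = slice(0, 12)     # columns 0-11 hold the name
PRICE = slice(12, None) # the rest holds the price
record[NAME].strip()    # 'Catan'
record[PRICE].strip()   # '10'
```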
##### Assignment to slices
```python
l = list(range(10))
# slice can be used to remove and insert elements
l[2:6] = [11,12]
# now l is [0, 1, 11, 12, 6, 7, 8, 9]
del l[7:8]
# now l is [0, 1, 11, 12, 6, 7, 8]
```
#### Operators on Sequences
The \+ and \* operators on sequences create a new sequence
```python
a = [1, 2]
b = a * 3
# b is [1, 2, 1, 2, 1, 2]
```
For an expression like **a \* n**, if **a** is a sequence containing mutable items, the result holds n references to the same items, so any change made through one reference is visible through all of them
```python
a = [['_'] * 2] * 2
# now a is [['_', '_'], ['_', '_']]
a[0][1] = '*'
# now a becomes [['_', '*'], ['_', '*']]
# True
a[0] is a[1]
```
Using list comp will help do the right thing
```python
a = [['_'] * 2 for _ in range(2)]
a[0][1] = '*'
# now a is [['_', '*'], ['_', '_']], since the listcomp builds two distinct inner lists
```
#### Augmented Assignment on Sequences
The special method for the += operator is \_\_iadd\_\_. If \_\_iadd\_\_ is not implemented, Python falls back to \_\_add\_\_.
For immutable objects, \_\_iadd\_\_ is not defined, so \_\_add\_\_ is invoked and a new object is created.
For mutable objects, if \_\_iadd\_\_ is defined as usual, the object's value is changed in place, but its identity remains the same.
```python
l = [1, 2]
id1 = id(l)
l += [3, 4]
id2 = id(l)
# True
id1 == id2
t = (1, 2)
id1 = id(t)
t += (3, 4)
id2 = id(t)
# False
id1 == id2
```
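A well-known corner case combines the two behaviors above: augmented assignment on a mutable item stored inside a tuple raises TypeError, yet the item is mutated anyway, because the in-place list extension succeeds before the tuple item assignment fails:

```python
t = (1, 2, [30, 40])
try:
    t[2] += [50, 60]
except TypeError:
    pass  # tuples do not support item assignment
t  # (1, 2, [30, 40, 50, 60]): the list was extended despite the exception
```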
#### Sort
*list.sort* sorts the list in place and returns None.
*sorted* builds a new sorted list from the original and returns it, leaving the original untouched.
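The difference is easy to check:

```python
l = [3, 1, 2]
s = sorted(l)   # s is a new list [1, 2, 3]; l is still [3, 1, 2]
r = l.sort()    # l becomes [1, 2, 3] in place; r is None
```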
#### Search with bisect
bisect and bisect_right return the insertion point to the right of (after) any existing entries equal to the target.
bisect_left returns the insertion point to the left of (before) any existing entries equal to the target.
```python
import bisect
l = [1, 2, 4, 4, 5, 7]
target = 4
# 4
bisect.bisect(l, target)
# 2
bisect.bisect_left(l, target)
```
insort finds the insertion point and inserts the target into the list in one step, keeping it sorted.
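With insort, a sorted list stays sorted as new items arrive:

```python
import bisect

l = [1, 2, 4, 5, 7]
bisect.insort(l, 3)
# l is now [1, 2, 3, 4, 5, 7]
```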
#### Other Sequences
##### Array
When used only to store numbers, an array is more efficient than a list.
Array provides fast functions for loading and saving data, such as *frombytes* and *tofile*
##### Memory View
Memory view is inspired by NumPy; it lets you handle slices of arrays without copying bytes, and its *cast* method changes the way multiple bytes are read or written as units
##### Deque
A double-ended queue; appending and popping from either end are **thread-safe** (atomic) operations.
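A short sketch of the deque API; with a maxlen, the deque is bounded and silently discards items from the opposite end when it is full:

```python
from collections import deque

dq = deque(range(5), maxlen=5)  # deque([0, 1, 2, 3, 4], maxlen=5)
dq.rotate(2)                    # deque([3, 4, 0, 1, 2], maxlen=5)
dq.appendleft(-1)               # full, so 2 falls off the right end
dq.extend([10, 20])             # -1 and 3 fall off the left end
# dq is now deque([4, 0, 1, 10, 20], maxlen=5)
```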
## Dictionary and Set
#### Dict Comprehension
A *dictcomp* is similar to a *listcomp*; it creates a *dict* instance by producing key:value pairs from any iterable
```python
ITEM_MAPPING = [(1, "Catan"), (2, "Carcassone"), (3, "Pandemic")]
d = {item_id: name for item_id, name in ITEM_MAPPING}
```
#### Update dict on missing key
```python
d = {}
key = 'some key'
d.setdefault(key, 1)
# the line above is equivalent to the following lines
if key not in d:
    d[key] = 1
```
#### Lookup dict on missing key
##### defaultdict
*defaultdict* uses a factory callable (its *default_factory*) to create and return a default value for a missing key whenever *\_\_getitem\_\_* is invoked
```python
import collections
# use list as factory method, empty list will be created when no keys found
dd = collections.defaultdict(list)
# output: empty list
dd[1]
# an empty list is created only when __getitem__ is invoked (lookup via [])
# otherwise (e.g. with .get), None is returned
# True
dd.get(2) is None
```
##### \_\_missing\_\_
if *\_\_missing\_\_* is implemented, then *\_\_getitem\_\_* will call it whenever a key is not found, instead of raising *KeyError*
```python
class NoMissingDict(dict):
    def __missing__(self, key):
        print("cannot find key", key)
nmd = NoMissingDict()
nmd[1]
```
#### Variations of dict
##### OrderedDict
Keys in insertion order
##### ChainMap
Holds a list of mappings that can be searched as one
##### Counter
A mapping that holds an integer count for each key; updating an existing key increases its count
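Quick sketches of all three variations:

```python
import collections

od = collections.OrderedDict([('a', 1), ('b', 2)])  # remembers insertion order

ct = collections.Counter('abracadabra')
ct['a']            # 5
ct.update('aaa')   # increases existing counts; missing keys start at 0
ct['a']            # 8

defaults = {'color': 'red', 'user': 'guest'}
overrides = {'user': 'admin'}
cm = collections.ChainMap(overrides, defaults)
cm['user']    # 'admin': the first mapping in the chain wins
cm['color']   # 'red': lookup falls through to the next mapping
```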
#### UserDict
Designed as a base class to be subclassed by the user
*UserDict* does not inherit from *dict*, but it holds an internal *dict* reference called *data*
```python
import collections
class MyDict(collections.UserDict):
    def __contains__(self, key):
        return str(key) in self.data
    def __setitem__(self, key, item):
        self.data[str(key)] = item
```
#### Immutable Dict
*MappingProxyType* builds a read-only proxy for the underlying mapping: we cannot change the proxy directly, but we can modify the original mapping, and the change can be seen through the proxy
```python
from types import MappingProxyType
d = {1: 'A'}
d_proxy = MappingProxyType(d)
# reading through the proxy works: d_proxy[1] == 'A'
# but writing through the proxy raises TypeError:
# d_proxy[2] = 'B'
# we can still modify the underlying mapping
d[2] = 'B'
# now d_proxy[2] == 'B'
```
#### Set
```python
# create set using {1}, {1, 2}
s = {1}
# cannot create empty set using {}, that represents dict
# we need to write set()
s = set()
```
##### Set Operation
```python
s1 = {1, 2, 3, 4, 5}
s2 = {1, 3}
e = 1
# intersection
s1 & s2
# union
s1 | s2
# difference
s1 - s2
# xor
s1 ^ s2
# predicate operation
s1 <= s2
e in s1
```
Elements of a *set* and keys of a *dict* must be hashable objects.
*dict* and *set* have high memory overhead, but they are very efficient for membership search.
#### Hashable
A hashable object must satisfy:
- support *hash()*; the return value must never change during the lifetime of the object
- support *eq()*
- if *a*==*b* is *True*, then *hash(a)* == *hash(b)* must be *True*
Flat data types like *str*, *bytes*, *int* are all hashable
*frozenset* is also hashable because all its elements must be hashable by definition
*tuple*, even though it is immutable, is only hashable when all its elements are hashable
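These rules can be checked directly:

```python
hash((1, 2, (3, 4)))     # fine: every element is hashable
hash(frozenset([1, 2]))  # fine: frozenset is always hashable
try:
    hash((1, 2, [3, 4])) # tuple holding a mutable list
except TypeError as exc:
    print(exc)           # unhashable type: 'list'
```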
## Function
#### Treat Function as Object
We can create a function at runtime, assign it to a variable, and pass it as an argument, just like a normal object.
```python
def square(n):
    return n * n
sq = square
sq(2)
sq.__doc__
type(sq)
```
#### Higher-Order Functions
Higher-order functions are functions that take functions as arguments or return a function as their result
```python
# sorted is a higher-order function
data = ['a', 'b', 'c', 'aa']
# use length as the sort key; len is a function returning the length
sorted(data, key=len)
def reverse(word): return word[::-1]
sorted(data, key=reverse)
```
##### map, filter, reduce
map and filter can usually be replaced by a list comp or a generator expression
```python
list(map(square, range(4)))
[square(i) for i in range(4)]
list(map(square, filter(lambda i: i % 2, range(4))))
[square(i) for i in range(4) if i % 2]
```
common built-in functions that implement the idea of reduce
```python
sum(range(10))
all([True, False, True])
any([True, False, True])
```
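When no built-in shortcut fits, functools.reduce is still available; for example, sum(range(10)) is the same as reducing range(10) with addition:

```python
from functools import reduce
from operator import add, mul

reduce(add, range(10))    # 45, same as sum(range(10))
reduce(mul, range(1, 6))  # 120, i.e. 5 factorial
reduce(add, [], 0)        # 0: the third argument seeds the accumulator
```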
#### Anonymous Function
We use *lambda* to create anonymous functions in Python
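A lambda body is limited to a single expression, which is usually all a sort key needs:

```python
fruits = ['strawberry', 'fig', 'apple', 'cherry', 'raspberry', 'banana']
by_reversed = sorted(fruits, key=lambda word: word[::-1])
# ['banana', 'apple', 'fig', 'raspberry', 'strawberry', 'cherry']
```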
#### Callable Objects
* User defined functions: created with *def*
* Built-in functions: len
* Built-in methods: dict.get()
* Methods: defined in class
* Classes: class can be used as function to create new instance
* Class instances: if the class implements a \_\_call\_\_ method, its instances can be invoked like functions
```python
class CallableClass:
    def __init__(self, state=0):
        self._state = state
    def update(self):
        self._state += 1
        return self._state
    def __call__(self):
        return self.update()

clz = CallableClass()
clz()
```
* Generator functions: functions which use the yield keyword
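Calling a generator function returns a generator object; its body runs lazily, one yield at a time:

```python
def countdown(n):
    while n > 0:
        yield n
        n -= 1

gen = countdown(3)
next(gen)   # 3: runs the body up to the first yield
list(gen)   # [2, 1]: consumes the rest
```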
#### Function Introspection
Functions, like normal objects, have multiple attributes
```python
square.__name__
square.__code__
square.__annotations__
square.__doc__
```
#### Positional and Keyword-only Arguments
```python
def function(position, *list_params, keyword_only, **dict_params): pass
function(1, 2, 3, 4, keyword_only=5, a=6, b=7)
# position will be 1
# list_params captures the extra positional arguments as the tuple (2, 3, 4)
# keyword_only will be 5
# dict_params captures the extra keyword arguments as {'a': 6, 'b': 7}
# SyntaxError: positional arguments cannot follow keyword arguments
# function(1, keyword_only=2, 3, 4)
```
#### Function Annotation
Python 3 supports attaching annotations to a function's signature
```python
# indicate the type of return val, type and restriction on params
def square(n: 'int > 0' = 1) -> int:
    return n * n
# the annotations will be stored in the __annotations__ attribute
square.__annotations__
```
| 21.250493 | 208 | 0.686282 | eng_Latn | 0.990592 |
3bbf15c4ee454f7bf558daf0bc3e550d1aa0af64 | 2,550 | markdown | Markdown | _posts/2017-12-23-building_a_flashcards_application_in_react_and_rails.markdown | Jschles1/Jschles1.github.io | 7272c822ec3660a4596dfb734ccf6fe894473fb2 | [
"MIT"
] | null | null | null | _posts/2017-12-23-building_a_flashcards_application_in_react_and_rails.markdown | Jschles1/Jschles1.github.io | 7272c822ec3660a4596dfb734ccf6fe894473fb2 | [
"MIT"
] | null | null | null | _posts/2017-12-23-building_a_flashcards_application_in_react_and_rails.markdown | Jschles1/Jschles1.github.io | 7272c822ec3660a4596dfb734ccf6fe894473fb2 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Building A Flashcards Application In React & Rails"
date: 2017-12-23 00:10:44 -0500
permalink: building_a_flashcards_application_in_react_and_rails
---
In the weeks after graduating from Flatiron's online program, I found myself scrambling to cram all the knowledge I had learned relating to JavaScript to prepare for interviews. I found myself using flashcards to help me be better able to describe concepts such as closure, scope, first-class functions, etc. Then I came up with an idea; why not build a flashcards app myself? I would be able to study these concepts while being able to put them into practice at the same time.
Also, I was sort of disappointed with the way my final project, the Workout Planner, turned out. The last-minute addition of the Suggested Workouts feature using seed data made it so the app didn't really allow for a later implementation of other features (such as user accounts and authentication) without having to go back and rewrite that feature completely. I was more concerned with meeting the project requirements so I could graduate as soon as possible. I planned to take what I learned from this mistake and structure my flashcards app in a way that would let me easily come back and add those features later. I see this flashcards app as a sort of redo of the final project.
For the most part, my Flashcards app structure is almost identical to the Workout Planner's; it implements React and Redux on the front-end, with a Rails API implemented on the back-end. Users can create and remove decks of cards relating to a subject of their choice. Users can then add or remove cards (each with a question and answer) to each deck.
I again used the Semantic-UI CSS framework for styling.
Now, here's the fun part: once a user creates and fills their deck with cards, they can then enter into an interactive quiz game using that deck. Users are first shown the question side of the card. Once they think they know the answer, they click the "Show Answer" button. The card is flipped, and answer to the question is shown. The user then clicks either the "Correct" or "Incorrect" button to document whether they got the question right or not and to move onto the next question. At the end, the user is provided with a percentage score of the questions they got correct.
Here's a screenshot:
![](https://i.imgur.com/5kf2miF.png)
[Here's](https://github.com/Jschles1/react-rails-flashcards) a link to the GitHub project repo if you're interested in checking it out.
| 110.869565 | 692 | 0.787451 | eng_Latn | 0.999742 |
3bbfb8d16b87ff4d7c774e89245f33622d874244 | 319 | md | Markdown | code/README.md | amath-idm/parallelizing-python | 1b6be776ed17bafb13de180ca9ce90f9c240f218 | [
"MIT"
] | null | null | null | code/README.md | amath-idm/parallelizing-python | 1b6be776ed17bafb13de180ca9ce90f9c240f218 | [
"MIT"
] | null | null | null | code/README.md | amath-idm/parallelizing-python | 1b6be776ed17bafb13de180ca9ce90f9c240f218 | [
"MIT"
] | null | null | null | # Code
This folder contains the code for the parallelization examples. Each example uses `model.py`; `model_no_numpy.py` is included as a curious edge case where parallelization is _slower_, for reasons I do not entirely understand! (Something to do with how very long for loops are handled by subprocesses, it seems.) | 106.333333 | 311 | 0.799373 | eng_Latn | 0.999932 |
3bc0bb4821fd9b28eda41ec6c3229bca1e15c35c | 1,242 | md | Markdown | docs/exporting.md | jmgilman/bdantic | 3caa66d681da7a0cf0dbd6481c3f9005a8f2d8b9 | [
"MIT"
] | 3 | 2022-02-02T19:38:59.000Z | 2022-02-16T03:39:50.000Z | docs/exporting.md | jmgilman/bdantic | 3caa66d681da7a0cf0dbd6481c3f9005a8f2d8b9 | [
"MIT"
] | null | null | null | docs/exporting.md | jmgilman/bdantic | 3caa66d681da7a0cf0dbd6481c3f9005a8f2d8b9 | [
"MIT"
] | null | null | null | # Exporting
## Overview
All models (with the exception of the [Account][bdantic.models.realize.Account]
model) contain an [export][bdantic.models.base.Base.export] method which will
produce the equivalent Beancount type from the model. It's essentially reversing
what the [parse][bdantic.models.base.Base.parse] method does:
```python
import bdantic
from beancount.core import amount
from decimal import Decimal
amt = amount.Amount(number=Decimal(1.50), currency="USD")
parsed_amt = bdantic.parse(amt)
exported_amt = parsed_amt.export()
assert amt == exported_amt
```
The model doesn't necessarily need to be derived from a Beancount type in order
to be exported. This is a useful feature because it allows creating Beancount
types that are protected by the power of Pydantic validation:
```python
from bdantic import models
from decimal import Decimal
# ValidationError: value is not a valid decimal (type=type_error.decimal)
amt = models.Amount(number=False, currency="USD").export()
```
## Helper Functions
In addition to the above method, an [export][bdantic.parse.export] function is
provided for exporting the given model. To export a list of models, use the
[export_all][bdantic.parse.export_all] function.
| 30.292683 | 80 | 0.781804 | eng_Latn | 0.980764 |
3bc0f9349b54415b00c61f6529461236859290ff | 218 | md | Markdown | _watches/M20210330_081055_TLP_2.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-05-19T17:04:49.000Z | 2021-03-30T03:09:14.000Z | _watches/M20210330_081055_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20210330_081055_TLP_2.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP2 - 30/03/2021 - M20210330_081055_TLP_2T.jpg
date: 2021-03-30 08:10:55
permalink: /2021/03/30/watch/M20210330_081055_TLP_2
capture: TLP2/2021/202103/20210329/M20210330_081055_TLP_2T.jpg
---
| 27.25 | 62 | 0.784404 | fra_Latn | 0.03341 |
3bc121b58be198a078a19493db17959a14349458 | 400 | md | Markdown | README.md | Chazzz/vintageous-custom-keybindings | 7465a354bac225499022e5ca514646da865c80e9 | [
"MIT"
] | null | null | null | README.md | Chazzz/vintageous-custom-keybindings | 7465a354bac225499022e5ca514646da865c80e9 | [
"MIT"
] | null | null | null | README.md | Chazzz/vintageous-custom-keybindings | 7465a354bac225499022e5ca514646da865c80e9 | [
"MIT"
] | null | null | null | # vintageous-custom-keybindings
custom keybindings for vintageous
Vintageous is advertised as not supporting remapped keybindings. This may be true, but it's not impossible! I have provided here a brief example of how I remapped two vintageous keys to two different keys.
If you have a more complicated setup which you would like to share (remapping capital letters, e.g.), send me a pull request!
| 57.142857 | 205 | 0.8025 | eng_Latn | 0.999918 |
3bc22150e7c311547af67d76463d72ec408eef6e | 31 | md | Markdown | README.md | yinzeqiang/spring-learning | 811a69da56610f75e21a2f9b2182d98ad93e29bc | [
"Apache-2.0"
] | null | null | null | README.md | yinzeqiang/spring-learning | 811a69da56610f75e21a2f9b2182d98ad93e29bc | [
"Apache-2.0"
] | null | null | null | README.md | yinzeqiang/spring-learning | 811a69da56610f75e21a2f9b2182d98ad93e29bc | [
"Apache-2.0"
] | null | null | null | # spring-learning
My notes from studying Spring
| 10.333333 | 17 | 0.83871 | eng_Latn | 0.347385 |