
Face Identification and Add to Server Database

PostPosted: Apr 29, 2010 09:33
by kaiyum
We have some photos of a few persons on our server (treated as templates) and will supply a picture of a person. The program should identify the person by comparing it against the templates in the folder and save the result to the database if a match is found.
We may give the input as a still image or a stream.
Please suggest which functionality I should use, as in the tutorials and samples in the SDK.

Admin: merged to "Matching questions"

Re: Face Identification and Add to Server Database

PostPosted: Apr 30, 2010 06:42
by Martynas
kaiyum wrote:We have some photos of a few persons on our server (treated as templates) and will supply a picture of a person. The program should identify the person by comparing it against the templates in the folder and save the result to the database if a match is found.
We may give the input as a still image or a stream.
Please suggest which functionality I should use, as in the tutorials and samples in the SDK.

Hello,

What SDK are you using?

For face extraction from an image and from a stream, please consider using the "FacesEnrollFromImage" and "FacesEnrollFromStream" tutorials.
If you are using NServer (included in the VeriLook Extended SDK and in the MegaMatcher SDK) for matching, see the "SendTaskToServer" tutorial to learn how a matching task is sent to NServer.
If you are using NMatcher for identification, see the "FacesIdentify" tutorial.
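
As a rough illustration of that identification flow (a minimal sketch, not taken from the SDK tutorials: the folder path, file pattern and class/method names below are placeholders, and the byte-array overloads of the identification methods are assumed based on the note later in this thread that NMatcher accepts templates saved as NBuffer or byte[]):

Code: Select all
using System.IO;
using Neurotec.Biometrics;

class FolderIdentification
{
    // Matches a probe face template against every template file in a folder.
    // The *.dat files are assumed to hold templates produced beforehand,
    // e.g. with the FacesEnrollFromImage / FacesEnrollFromStream tutorials.
    static string IdentifyAgainstFolder(string probeTemplatePath, string templateFolder)
    {
        using (NMatcher matcher = new NMatcher())
        {
            byte[] probe = File.ReadAllBytes(probeTemplatePath);
            matcher.IdentifyStart(probe);
            try
            {
                foreach (string file in Directory.GetFiles(templateFolder, "*.dat"))
                {
                    int score = matcher.IdentifyNext(File.ReadAllBytes(file));
                    if (score > 0) // zero means the score fell below the matching threshold
                        return file; // matched - save the corresponding record to your database here
                }
            }
            finally
            {
                matcher.IdentifyEnd();
            }
            return null; // no template in the folder matched
        }
    }
}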

Matching questions

PostPosted: Jun 13, 2011 14:10
by PRB
Good Afternoon!

I'm running "AbisSample" in C# to test the matching process of two or more faces.

I have created 3 templates:
- 2 with my face (so I think the match will be correct).
- 1 with the face of another person (so I think the match will not be correct).

Using the "verification" button of the form, loading the template of my face and matching it against the record of my face in the local database, I obtain this result:

Matched with id=<Template ID>, score=146
Face match details: face index: 1; score: 146
face index: 0; score: 127;
face index: 0; score: 155;


Is this the message that indicates a correct match between the two faces?

And does the next result indicate that the match is not correct?

Verified against id==<Template ID>,
Face match details: face index: -1; score: 0
face index: -1; score: 0;


I hope you can clear up my doubt.

Thanx.

Admin: renamed to "Face matching questions" as the topic is more about face matching than using the C# sample

Re: C# AbisSample

PostPosted: Jun 14, 2011 08:44
by vaidasz
PRB wrote:Good Afternoon!

I'm running "AbisSample" in C# to test the matching process of two or more faces.

I have created 3 templates:
- 2 with my face (so I think the match will be correct).
- 1 with the face of another person (so I think the match will not be correct).

Using the "verification" button of the form, loading the template of my face and matching it against the record of my face in the local database, I obtain this result:

Matched with id=<Template ID>, score=146
Face match details: face index: 1; score: 146
face index: 0; score: 127;
face index: 0; score: 155;


Is this the message that indicates a correct match between the two faces?

And does the next result indicate that the match is not correct?

Verified against id==<Template ID>,
Face match details: face index: -1; score: 0
face index: -1; score: 0;


I hope you can clear up my doubt.

Thanx.


Hello,

Yes, you are right. However, as I can see, the first result comes not from verification but from identification (1:N matching), while the second one was produced by verification (1:1 matching).

Re: C# AbisSample

PostPosted: Jun 14, 2011 09:01
by PRB
vaidasz wrote:
PRB wrote:Good Afternoon!

I'm running "AbisSample" in C# to test the matching process of two or more faces.

I have created 3 templates:
- 2 with my face (so I think the match will be correct).
- 1 with the face of another person (so I think the match will not be correct).

Using the "verification" button of the form, loading the template of my face and matching it against the record of my face in the local database, I obtain this result:

Matched with id=<Template ID>, score=146
Face match details: face index: 1; score: 146
face index: 0; score: 127;
face index: 0; score: 155;


Is this the message that indicates a correct match between the two faces?

And does the next result indicate that the match is not correct?

Verified against id==<Template ID>,
Face match details: face index: -1; score: 0
face index: -1; score: 0;


I hope you can clear up my doubt.

Thanx.


Hello,

Yes, you are right. However, as I can see, the first result comes not from verification but from identification (1:N matching), while the second one was produced by verification (1:1 matching).


Hello Vaidasz! Thanx for the answer!

You're right, I've posted the result messages from two different operations.

So, if the verification process finds that two faces are not from the same person, will the "score" value always be "0"?
Conversely, if two faces are from the same person, is the "score" value greater than 0? And is the "score" value higher the more similar the two faces are?

Thanx!

Re: C# AbisSample

PostPosted: Jun 14, 2011 13:09
by vaidasz
PRB wrote:
vaidasz wrote:
PRB wrote:Good Afternoon!

I'm running "AbisSample" in C# to test the matching process of two or more faces.

I have created 3 templates:
- 2 with my face (so I think the match will be correct).
- 1 with the face of another person (so I think the match will not be correct).

Using the "verification" button of the form, loading the template of my face and matching it against the record of my face in the local database, I obtain this result:

Matched with id=<Template ID>, score=146
Face match details: face index: 1; score: 146
face index: 0; score: 127;
face index: 0; score: 155;


Is this the message that indicates a correct match between the two faces?

And does the next result indicate that the match is not correct?

Verified against id==<Template ID>,
Face match details: face index: -1; score: 0
face index: -1; score: 0;


I hope you can clear up my doubt.

Thanx.


Hello,

Yes, you are right. However, as I can see, the first result comes not from verification but from identification (1:N matching), while the second one was produced by verification (1:1 matching).


Hello Vaidasz! Thanx for the answer!

You're right, I've posted the result messages from two different operations.

So, if the verification process finds that two faces are not from the same person, will the "score" value always be "0"?
Conversely, if two faces are from the same person, is the "score" value greater than 0? And is the "score" value higher the more similar the two faces are?

Thanx!


If the face is not matched, the result will be 0.
The result returned when matching is a similarity score reflecting the probability that the faces match.
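
As a minimal sketch of how that reads in code (assuming the NMatcher.Verify overload with NMatchingDetails that is used later in this thread, and that the two template variables already hold extracted face templates):

Code: Select all
// probeTemplate and galleryTemplate are assumed to be previously extracted face templates.
NMatchingDetails details;
matcher.Verify(probeTemplate, galleryTemplate, out details);

if (details.FacesScore > 0)
    Console.WriteLine("Matched, similarity score: " + details.FacesScore);
else
    Console.WriteLine("Not matched (score below the matching threshold)");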

Re: C# AbisSample

PostPosted: Jun 14, 2011 13:22
by PRB
For example, I sometimes get a return value of "146" and sometimes a return value of "9634" when matching a ".dat" file against a template in the local database.
I obtained the result of "9634" when matching a ".dat" file with its own template (generated at the same moment I saved the ".dat" file on my PC's file system).
If the ".dat" file and the template refer to the same face, what is the meaning of the "9634" result? Does it not match at 100%?

Re: C# AbisSample

PostPosted: Jun 15, 2011 08:49
by vaidasz
PRB wrote:For example, I sometimes get a return value of "146" and sometimes a return value of "9634" when matching a ".dat" file against a template in the local database.
I obtained the result of "9634" when matching a ".dat" file with its own template (generated at the same moment I saved the ".dat" file on my PC's file system).
If the ".dat" file and the template refer to the same face, what is the meaning of the "9634" result? Does it not match at 100%?


Hello,

First of all, please use the score returned by NMatcher - 9634.
The 9634 score reflects how similar the face is to the features extracted from the enrolled face. You won't get the same score of 9634 every time you match the same image, because the same amount of features will not be found and extracted from the face each time.
You can always read the documentation (provided with the SDK), section "5.1.1 Matching Threshold and Score".

Matching Score [Face Match]

PostPosted: Sep 21, 2011 06:34
by suleha
Hi,

I'm having a problem extracting the matching score using NServer.
In SendTask tutorial:
Code: Select all
code = cluster_get_results_data(clientHandle, i, id_len, idStr, md_len, mdStr);

Is it correct that mdStr is the matching score? I tried to print out the value, but it returns the same value for the 3 different IDs returned (the image sent to be matched contains 3 faces).

Another question: is there any accuracy difference between using NServer and NMatcher?
I ran FacesSamplesWX and AbisSampleCS, enrolled the same faces, and then tried to identify the same image file in both applications.
I found that face detection in AbisSampleCS is not as good as in FacesSamplesWX (I set all extraction, detection and identification settings to the same values).

Admin: merged to "Face matching questions"

Re: Matching Score

PostPosted: Sep 21, 2011 13:22
by Martynas
suleha wrote:Hi,

I'm having a problem extracting the matching score using NServer.
In SendTask tutorial:
Code: Select all
code = cluster_get_results_data(clientHandle, i, id_len, idStr, md_len, mdStr);

Is it correct that mdStr is the matching score? I tried to print out the value, but it returns the same value for the 3 different IDs returned (the image sent to be matched contains 3 faces).


Hello,

To get the matching score, add the code below after the code where the id is returned.
Code: Select all
NInt score = 0;
code = cluster_get_results_similarity(clientHandle, 0, id_len, idStr, &score);


mdStr is a pointer to the matching details, which are returned when the CLUSTER_DETAILS parameter is set. If you need to get the matching details, then you need to change CLUSTER_NORMAL to CLUSTER_DETAILS in the code below.
Code: Select all
   
code = cluster_packet_create_standard_template_task(clientHandle, CLUSTER_NORMAL,
    (int)nTemplateBufferSize, nTemplateBuffer, templateType,
    (int)strlen(query), query,
    cluster_task_params_get_param_buffer_size(cluster_task_params),
    cluster_task_params_get_param_buffer(cluster_task_params), 100);

You will also need to use the NMMatchDetailsDeserialize function to deserialize the received details.

Regarding the accuracy of NServer and NMatcher - it is the same, because NServer uses NMatcher.

suleha wrote:Another question: is there any accuracy difference between using NServer and NMatcher?
I ran FacesSamplesWX and AbisSampleCS, enrolled the same faces, and then tried to identify the same image file in both applications.
I found that face detection in AbisSampleCS is not as good as in FacesSamplesWX (I set all extraction, detection and identification settings to the same values).

Is it possible to get the data and detailed steps to reproduce this issue on our side? Actually, there should be no differences.

Re: Matching Score

PostPosted: Sep 27, 2011 02:08
by suleha
Hi,

I have another question regarding the matching score.
How do I get the matching scores of the other matched records when using NServer?
As an example, I have 1000 people in the database, and ID#3 is identified correctly... But there are other people who might have face features similar to person ID#3 and who also get quite a high score (although not the highest).
I would like to have this ID information together with its score.

Another question: how do I get the score as a percentage? What is the maximum value of a score?

Re: Matching Score

PostPosted: Sep 27, 2011 10:51
by Martynas
Hello,

suleha wrote:How do I get the matching scores of the other matched records when using NServer?
As an example, I have 1000 people in the database, and ID#3 is identified correctly... But there are other people who might have face features similar to person ID#3 and who also get quite a high score (although not the highest).
I would like to have this ID information together with its score.


When you send the matching task to the server, you need to define the matching threshold. During matching, if the matching score is higher than or equal to the matching threshold, the score and the id are returned for each match which satisfied the threshold. Thus you can get several ids and scores if there are multiple matches. If the score is lower than the matching threshold, then the score is set to 0 and it is not returned.
If you need to get more results, then set a lower matching threshold and more results will be returned, i.e. if you set the matching threshold to 0, then all ids from the database with matching scores higher than 0 will be returned.

suleha wrote:Another question: how do I get the score as a percentage? What is the maximum value of a score?

It is not possible to convert the score to a percentage, because the score is not a linear value and it does not have a maximum value. The matching score represents the probability of false acceptance: the higher the matching score, the higher the probability that the matched templates belong to the same person.
If needed, you can convert the matching score to the False Acceptance Rate (FAR), which is expressed in percent, to see the probability that a false acceptance occurred. The conversion equation is provided in the documentation of the SDK, section "Matching threshold and score".
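
For reference, that conversion has the form below in a function quoted later in this thread (treat it as a sketch of the documented equation rather than an authoritative formula; ThFARLogRatio is the constant taken from the documentation table - 12, 24, 36, ...):

\[
\mathrm{FAR}(\%) \approx 10^{\,2 - \mathrm{score}/\mathrm{ThFARLogRatio}}
\]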

VeriLook matching score

PostPosted: Feb 16, 2012 15:22
by PRB
Hello Vaidasz!
I'm sorry for the long time that has passed since I last asked you for information about VeriLook.
There are some details that are still not clear to me. I have acquired three templates of two different faces using the digital camera integrated in my notebook.
After that I launched the "verification" feature, obtaining these results:
Code: Select all
Verified against id=TEST - 001
    Face match details: face index: 0; score: 9322
    face index: 0; score: 9334;
    face index: -1; score: 0;
    face index: -1; score: 0;
    face index: -1; score: 0;

Verified against id=RIMOLDI - 001
    Face match details: face index: 3; score: 8
    face index: 1; score: 15;
    face index: -1; score: 0;
    face index: -1; score: 0;
    face index: 0; score: 20;

Verified against id=RIMOLDI - 002
    Face match details: face index: 1; score: 0
    face index: 0; score: 9;
    face index: 0; score: 11;
    face index: -1; score: 0;
    face index: -1; score: 0;


In a previous post you told me that the score indicates the probability that two faces match.
So, in the first example, what does 9322 mean? Is it a percentage value? Can I "read" that value as "93.22%"?
In the other examples "score: 0" means that the two faces do not match. But does "score: 8" mean that the faces match with a probability of 8%?

Thanx again for your patience.

Re: VeriLook matching score

PostPosted: Feb 17, 2012 10:11
by vaidasz
Hello,

So, in the first example, what does 9322 mean? Is it a percentage value? Can I "read" that value as "93.22%"?
In the other examples "score: 0" means that the two faces do not match. But does "score: 8" mean that the faces match with a probability of 8%?

No, that is not a correct interpretation.
The better the face picture used for template extraction, the more features will be extracted, and that leads to a higher matching score, which is unlimited. So a score of 9322 does not mean that the face is matched at 93.22%; it is just the score obtained when matching the two templates.
The returned score depends on the matching threshold you have set: the lower the threshold, the higher the FAR. More information about the matching threshold and FAR can be found in documentation section "5.1.1 Matching Threshold and FAR/FRR". Which threshold you have to set depends on your requirements.

Face comparison

PostPosted: Apr 21, 2012 21:53
by eru_iluvatar
Hi,

in part of my bachelor thesis I need to compare 2 faces of the same person - I don't have any other photos of that person, so I need to compare just these 2 photos and decide how much they look like each other - whether they really are photos of the same person or not.
I am using this code:

Code: Select all
NBuffer probeTemplate = ExtractTemplate(extractor, img1, false);
NBuffer[] galleryTemplates = new NBuffer[1];
galleryTemplates[0] = ExtractTemplate(extractor, img2, false);

NMatchingDetails details;
matcher.Verify(probeTemplate, galleryTemplates[0], out details);


So if I look into the details variable - there is a FacesScore property - it gives me, for example, 37. Does that mean that the similarity between these 2 faces is 37%? If not, how can I get their likeness as a percentage?
Thanks a lot for a fast reply!

Admin: merged to "Face matching questions"

Face comparison

PostPosted: Apr 21, 2012 22:01
by eru_iluvatar
Hi,

in part of my bachelor thesis I need to compare 2 photos - to tell whether they contain the same person or not - and to show how much they look alike, for example as a percentage.
I am using this code:

Code: Select all
NBuffer probeTemplate = ExtractTemplate(extractor, img1, false);
NBuffer[] galleryTemplates = new NBuffer[1];
galleryTemplates[0] = ExtractTemplate(extractor, img2, false);
NMatchingDetails details; 
matcher.Verify(probeTemplate, galleryTemplates[0], out details);


The details variable contains a FacesScore property, and it is, for example, 37. What does it mean? Is it a 37% similarity of the faces, or something else? If not, how can I get how much the faces look like each other?
Thanks a lot for a fast reply!

Re: Face comparison

PostPosted: Apr 23, 2012 13:06
by Martynas
eru_iluvatar wrote:Hi,
in part of my bachelor thesis I need to compare 2 photos - to tell whether they contain the same person or not - and to show how much they look alike, for example as a percentage.
I am using this code:
...
The details variable contains a FacesScore property, and it is, for example, 37. What does it mean? Is it a 37% similarity of the faces, or something else? If not, how can I get how much the faces look like each other?
Thanks a lot for a fast reply!

Hello,

The Verify function returns the matching score, which reflects the probability of false acceptance. It is not possible to convert this value to a percentage, because the matching score is not a linear value and it does not have a maximum limit.
You can find more information on the matching score in the documentation of the SDK, section "Matching threshold and score".

Re: Matching Score

PostPosted: Nov 14, 2012 09:47
by suleha
Hi,

I couldn't recall where I got this equation/function:
Code: Select all
double MatchingThresholdToFAR(int score, int ThFARLogRatio)
{
   if (score < 0) score = 0;  // clamp negative scores
   return pow(10.0, -(double)score / ThFARLogRatio + 2);  // FAR in percent; pow() is from <math.h>
}

'score' refers to the matching score value returned by the matching server (NServer), while ThFARLogRatio is taken from the table in chapter 4.1.1 Matching Threshold and FAR/FRR of the SDK documentation (12, 24, 36, etc.).
In order to get the matching accuracy, I did this: accuracy = 100 - scoreFAR;

Is this correct?

Re: Matching Score

PostPosted: Nov 14, 2012 13:27
by Martynas
suleha wrote:Hi,

I couldn't recall where I got this equation/function:
Code: Select all
double MatchingThresholdToFAR(int score, int ThFARLogRatio)
{
   if (score < 0) score = 0;  // clamp negative scores
   return pow(10.0, -(double)score / ThFARLogRatio + 2);  // FAR in percent; pow() is from <math.h>
}

'score' refers to the matching score value returned by the matching server (NServer), while ThFARLogRatio is taken from the table in chapter 4.1.1 Matching Threshold and FAR/FRR of the SDK documentation (12, 24, 36, etc.).
In order to get the matching accuracy, I did this: accuracy = 100 - scoreFAR;

Is this correct?

The accuracy cannot be calculated programmatically, as it is a relation between the FAR and the FRR, and it is not possible to calculate the FRR.

Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 27, 2012 15:57
by andresvergara
Hello,

I use VeriLook 5.2. I tried to set the FacesMatchingSpeed property value, but it does not change.
The code is as follows:
Code: Select all
 _matcher = new NMatcher();
MessageBox.Show("Matcher Speed Before:" + _matcher.FacesMatchingSpeed);
_matcher.FacesMatchingSpeed = nmsFacesMatchingSpeed;
MessageBox.Show("Matcher Speed After:" + _matcher.FacesMatchingSpeed);

nmsFacesMatchingSpeed is NMatchingSpeed.High, and the messages show "Matcher Speed Before: Low" and "Matcher Speed After: Low".

Please...help

Admin: merged to "Face matching questions"

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 28, 2012 09:18
by Martynas
andresvergara wrote:Hello,

I use VeriLook 5.2. I tried to set the FacesMatchingSpeed property value, but it does not change.
The code is as follows:
Code: Select all
_matcher = new NMatcher();
MessageBox.Show("Matcher Speed Before:" + _matcher.FacesMatchingSpeed);
_matcher.FacesMatchingSpeed = nmsFacesMatchingSpeed;
MessageBox.Show("Matcher Speed After:" + _matcher.FacesMatchingSpeed);

nmsFacesMatchingSpeed is NMatchingSpeed.High, and the messages show "Matcher Speed Before: Low" and "Matcher Speed After: Low".

Please...help

Hi,

To be able to set the matching speed to High, you need the "Fast Face Matcher" license, which is available to customers who have purchased the MegaMatcher Standard/Extended SDK. The VeriLook Standard/Extended SDKs include only the "Face Matcher" license, which allows only the "Low" matching speed to be set.

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 28, 2012 14:35
by andresvergara
Thank you very much for your reply. I have another question, regarding the FacesMatchingThreshold property.
Does the output score of the _matcher.IdentifyNext method vary according to the value entered in _matcher.FacesMatchingThreshold?

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 28, 2012 14:38
by Martynas
Hi,

andresvergara wrote:Does the output score of the _matcher.IdentifyNext method vary according to the value entered in _matcher.FacesMatchingThreshold?


What do you mean by "varies"? Could you provide more details?

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 28, 2012 18:59
by andresvergara
Martynas wrote:Hi,

andresvergara wrote:Does the output score of the _matcher.IdentifyNext method vary according to the value entered in _matcher.FacesMatchingThreshold?


What do you mean by "varies"? Could you provide more details?


Excuse my English... The Neurotec_Biometric_SDK_Documentation.pdf says:
"Matching Threshold: the minimum score that verification and identification functions accept to assume that the compared faces belong to the same person."

My question is:
Does the _matcher.IdentifyNext method internally use the _matcher.FacesMatchingThreshold parameter to calculate the output score?

If _matcher.FacesMatchingThreshold is higher or lower, how does that influence the score?

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 29, 2012 07:58
by Martynas
andresvergara wrote:Excuse my English... The Neurotec_Biometric_SDK_Documentation.pdf says:
"Matching Threshold: the minimum score that verification and identification functions accept to assume that the compared faces belong to the same person."

My question is:
Does the _matcher.IdentifyNext method internally use the _matcher.FacesMatchingThreshold parameter to calculate the output score?

If _matcher.FacesMatchingThreshold is higher or lower, how does that influence the score?

Hello,

If the matching score is equal to or greater than the matching threshold, the result is treated as a match and the score is returned.
If the matching score is lower than the matching threshold, the result is treated as not matched and a zero value is returned.

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 29, 2012 15:04
by andresvergara
Hi,
Then, are the following lines in my code correct?
Code: Select all
_matcher.IdentifyStart(templateA);
score = _matcher.IdentifyNext(templateB);

if (score > _matcher.FacesMatchingThreshold)
{
    MessageBox.Show("Match template");
}
else
{
    MessageBox.Show("Not Match template");
}


With FAR = 100%, any score > 0 ------> "Match template"
With FAR = 1%, any score > 24 ------> "Match template"

Is that correct?

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 30, 2012 07:26
by Martynas
andresvergara wrote:Hi,
Then, are the following lines in my code correct?
Code: Select all
_matcher.IdentifyStart(templateA);
score = _matcher.IdentifyNext(templateB);

if (score > _matcher.FacesMatchingThreshold)
{
    MessageBox.Show("Match template");
}
else
{
    MessageBox.Show("Not Match template");
}

With FAR = 100%, any score > 0 ------> "Match template"
With FAR = 1%, any score > 24 ------> "Match template"

Is that correct?

Yes, this is correct.
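
For reference, those two cut-off values follow from inverting the MatchingThresholdToFAR conversion quoted earlier in this thread (a sketch assuming ThFARLogRatio = 12 from the documentation table):

\[
\text{threshold} = \text{ThFARLogRatio}\cdot\bigl(2 - \log_{10}\text{FAR}(\%)\bigr)
\]

FAR = 100%: threshold = 12 · (2 − 2) = 0, so any score > 0 counts as a match.
FAR = 1%: threshold = 12 · (2 − 0) = 24, so any score > 24 counts as a match.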

Re: Property FacesMatchingSpeed not set [Face Match]

PostPosted: Nov 30, 2012 14:47
by andresvergara
Thank you, Martynas

VeriFinger 6.5 Matching Issue [Match]

PostPosted: Jan 11, 2013 20:13
by KimNguyen
I am using the full Neurotechnology suite to perform matching on a face, two irises, and two fingers. The logic is the same for each modality, but I'm having issues with getting a second finger to match. Essentially a list of objects in another class is iterated on and each template is passed to the performSearch method that I'll list below. In a previous version of this code that used VF 6.2, this implementation worked. But now, it is not working. Any thoughts? Below is the stack trace from the event viewer:

Code: Select all
Error processing search request for subject 0 RightIndex
Stack Trace:
Neurotec.NInvalidOperationException: Identification is not started
   at Neurotec.NResult.RaiseError(Int32 error)
   at Neurotec.NResult.Check(Int32 result)
   at Neurotec.Biometrics.NMatcher.IdentifyEnd()
   at Program.Verifinger.VeriFinger65.performSearch(BTemplate template)
   at Program.Verifinger.VeriFingerService.searchThread(Object instance)


Code: Select all
CATCH:    at Neurotec.NResult.RaiseError(Int32 error)
   at Neurotec.NResult.Check(Int32 result)
   at Neurotec.Biometrics.NMatcher.IdentifyStart(NBuffer template)
   at Program.Verifinger.VeriFinger65.performSearch(BTemplate template)


Below is the code:
Code: Select all
public static List<IMatcherResults> performSearch(BTemplate template)
        {

            List<IMatcherResults> results = new List<IMatcherResults>();

            NMatcher matcher = null;
            initializeMatcher(out matcher);

            NTemplate probeTemplate = null;
            try
            {
                Modality modality = template.Modality;

                probeTemplate = new NTemplate(template.Template);

                matcher.IdentifyStart(probeTemplate.Save());

                DTemplate[] templates;
                switch (modality)
                {
                    case Modality.LeftIndex:
                        templates = targets7.getGalleryArray();
                        break;
                    case Modality.RightIndex:
                        templates = targets2.getGalleryArray();
                        break;
                    default:
                        probeTemplate.Dispose();
                        return results;
                }

                for (int i = 0; i < templates.Length; i++)
                {
                    NTemplate targetTemplate = null;
                    try
                    {
                        targetTemplate = new NTemplate(templates[i].Template);
                        int score = matcher.IdentifyNext(targetTemplate.Save());
                        if (score > 0)
                        {
                            VeriFinger65SearchResults result = new VeriFinger65SearchResults();
                            result.ProbeTemplate = template;
                            result.TargetTemplate = templates[i];
                            result.Score = score;
                            results.Add(result);
                        }
                        targetTemplate.Dispose();
                    }
                    catch
                    {
                        if (targetTemplate != null)
                            targetTemplate.Dispose();
                    }
                }
            }
            catch (Exception ex)
            {
                MatcherEventLog.reportLog("CATCH: " + ex.StackTrace.ToString());
                throw new VeriFingerException("Error occurred when performing a search", ex);
            }
            finally
            {
                if (matcher != null)
                {
                    matcher.IdentifyEnd();
                    matcher.Dispose();
                }
                if (probeTemplate != null)
                {
                    probeTemplate.Dispose();
                }
            }
            return results;
        }


The only thing that has really changed is the VF version, and before any of the searching happens I submit all the fingerprint templates to the service for quality checking (previously I submitted a smaller subset). It appears the error is happening WITHIN Neuro, so I can't even see exactly what the issue is. Any thoughts?

Admin: merged to "Matching questions"

Re: VeriFinger 6.5 Matching Issue

PostPosted: Jan 14, 2013 14:40
by Martynas
KimNguyen wrote:I am using the full Neurotechnology suite to perform matching on a face, two irises, and two fingers. The logic is the same for each modality, but I'm having issues with getting a second finger to match. [...] It appears the error is happening WITHIN Neuro so I can't even see exactly what the issue is. Any thoughts?

Hello,

Could you provide a small test application and a detailed description of this issue, along with the SDK revision number, to support@neurotechnology.com?

Re: VeriFinger 6.5 Matching Issue

PostPosted: Jan 18, 2013 14:50
by KimNguyen
KimNguyen wrote:I am using the full Neurotechnology suite to perform matching on a face, two irises, and two fingers. The logic is the same for each modality, but I'm having issues with getting a second finger to match. [...] It appears the error is happening WITHIN Neuro so I can't even see exactly what the issue is. Any thoughts?

The issue was resolved. In an initialize method I was calling ObtainComponents, and then for the extraction and matching process I was calling ObtainComponents again. Once I removed the extra call to ObtainComponents in the regular initialize method, everything worked fine.

Similarity score [Match]

PostPosted: Feb 01, 2013 12:56
by TIlles
Hello everyone!

We have been working for a few weeks now on upgrading from VeriFinger 4.3 to 6.5 in our Java application. We store fingerprints as generalized templates made from three prints in a database. When someone is trying to log in to the system, we read her fingerprint and compare it to the stored fingerprints, and thus receive a similarity score.

We want to use a fixed similarity score as the basis of the identification, e.g. if the resulting similarity score is greater than X, then the person can log in, but we have no idea what to use as X. We get a lot of different values as results, some as low as 200, some as high as 2200. We usually get lower results if we compare prints and templates made with different scanners. We have a DigitalPersona U.are.U 4000B and a 4500 scanner; using the former usually results in lower similarity scores.

Can you recommend a "safe" value for "X" that we can use with all scanners?

Admin: merged to "Matching questions"

Re: Similarity score [Match]

PostPosted: Feb 04, 2013 08:02
by Martynas
TIlles wrote:Hello everyone!

We have been working for a few weeks now on upgrading from VeriFinger 4.3 to 6.5 in our Java application. We store fingerprints as generalized templates made from three prints in a database. When someone is trying to log in to the system, we read her fingerprint and compare it to the stored fingerprints, and thus receive a similarity score.

We want to use a fixed similarity score as the basis of the identification, e.g. if the resulting similarity score is greater than X, then the person can log in, but we have no idea what to use as X. We get a lot of different values as results, some as low as 200, some as high as 2200. We usually get lower results if we compare prints and templates made with different scanners. We have a DigitalPersona U.are.U 4000B and a 4500 scanner; using the former usually results in lower similarity scores.

Can you recommend a "safe" value for "X" that we can use with all scanners?

Hello,

The similarity score represents the probability that a false acceptance has occurred: the higher the score, the lower the probability that a false acceptance occurred.
The matching algorithm uses the matching threshold to filter the matching results. If the similarity score is equal to or higher than the set matching threshold, the score is returned and the match is considered successful. If the score is lower than the matching threshold, 0 is returned and the fingerprints are considered not to match.
Using different fingerprint scanners will result in lower similarity scores, because of the different sensors used in the scanners. Each sensor has its own distortions, which leads to slightly different extracted minutiae positions, and this can lead to lower matching scores.
We cannot say which exact matching threshold will be safe - usually it is defined in the project requirements or set based on field test results. What we can say is that the default matching threshold of 48 is quite low and is not recommended.
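
As a rough, hedged illustration using the MatchingThresholdToFAR conversion quoted earlier in this thread (and assuming ThFARLogRatio = 12 from the documentation table), the default threshold of 48 corresponds to

\[
\mathrm{FAR} \approx 10^{\,2 - 48/12}\,\% = 10^{-2}\,\% = 0.01\%,
\]

i.e. roughly one false acceptance per 10,000 impostor comparisons.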

Re: Similarity score [Match]

PostPosted: Feb 05, 2013 14:21
by TIlles
Martynas wrote:The similarity score represents the probability that a false acceptance has occurred: the higher the score, the lower the probability that a false acceptance occurred.
The matching algorithm uses the matching threshold to filter the matching results. If the similarity score is equal to or higher than the set matching threshold, the score is returned and the match is considered successful. If the score is lower than the matching threshold, 0 is returned and the fingerprints are considered not to match.
Using different fingerprint scanners will result in lower similarity scores, because of the different sensors used in the scanners. Each sensor has its own distortions, which leads to slightly different extracted minutiae positions, and this can lead to lower matching scores.
We cannot say which exact matching threshold will be safe - usually it is defined in the project requirements or set based on field test results. What we can say is that the default matching threshold of 48 is quite low and is not recommended.

I see, this clarified a lot for us. Thank you for the quick reply!

Finger matching [Match]

PostPosted: Mar 25, 2013 06:44
by sgjoshi
Hi,

I have downloaded the SDK, Eclipse and Maven. I followed the instructions and was able to build the samples. I can even run them from the command line on Windows. My interest is in fingerprint matching and here is what I did:

1. I ran the "simple-fingers-sample". The Swing application was able to register finger, create a template and save image too.
2. Now I want to create a new application - along the lines of the tutorials provided. However, I am not familiar with Maven at all and hence have not been able to compile and run the tutorials.
3. There is a tutorial for verifying a finger, which works at the command prompt and prints out the matching results. However, this tutorial code reads templates or images. What I want is this:
a: register a finger and create template
b: ask the user to put his/her finger on scanner
c: get the image from scanner and then match with template and print the results

This may sound too simple, but can someone guide me?

Thanks,
Sudhir

Admin: merged to "Matching questions"

Re: Finger matching [Java][Match]

PostPosted: Mar 25, 2013 15:10
by Martynas
sgjoshi wrote:Hi,

I have downloaded the SDK, Eclipse and Maven. I followed the instructions and was able to build the samples. I can even run them from the command line on Windows. My interest is in fingerprint matching and here is what I did:

1. I ran the "simple-fingers-sample". The Swing application was able to register finger, create a template and save image too.
2. Now I want to create a new application - along the lines of the tutorials provided. However, I am not familiar with Maven at all and hence have not been able to compile and run the tutorials.
3. There is a tutorial for verifying a finger, which works at the command prompt and prints out the matching results. However, this tutorial code reads templates or images. What I want is this:
a: register a finger and create template
b: ask the user to put his/her finger on scanner
c: get the image from scanner and then match with template and print the results

This may sound too simple, but can someone guide me?

Thanks,
Sudhir

Hello,

"simple-finger-sample" shows the same functionality, which you require. So basically you need to combine this sample into your application.
Maven is chosen as the most comfortable configuration tool for us, but you can avoid using it and create your own projects without Maven usage.
If you want to compile tutorials provided in the SDK, then you can run "mvn clean install" command in the SDK "tutorials" folder.
If you want to use it in the Eclipse, then import them as Maven project from the "Tutorials" folder, and define working directory to point to SDK "bin\win32_x86" or "bin\win64_x64" folder. Also you will need to define additional parameters for each tutorial.

What exactly is MegaMatcher?

PostPosted: Jun 26, 2013 12:07
by jacobg
We currently use the VeriLook product on server for face recognition. We are starting to see scalability issues when trying to identify a face from a set of hundreds of templates, and so I'm looking into a solution to eventually be able to quickly identify a face from a set of thousands of templates. I see MegaMatcher advertises such robust performance. It's not clear to me though exactly what MegaMatcher does. It seems like it's more than just a wrapper over VeriLook that dispatches identify tasks to a cluster of servers, because the per-server performance is spec'ed to be much faster than VeriLook. Can you explain please exactly what is different from VeriLook at a lower level? Does it create indexable information that you can store in a database and query for to narrow the possible matching results, such as gender, face features or other types of attributes? Is the accuracy as good as VeriLook?

Thanks a lot.

Admin: merged to "Matching questions"

Re: What exactly is MegaMatcher?

PostPosted: Jun 27, 2013 07:56
by Martynas
Hello,

jacobg wrote:Can you explain please exactly what is different from VeriLook at a lower level?


The face matching algorithm is exactly the same in VeriLook and MegaMatcher. The difference is that VeriLook provides the Face Matcher license, while MegaMatcher provides the Face Fast Matcher license, which enables a faster face matching mode compared to the Face Matcher license.
The MegaMatcher Extended SDK also includes the possibility of a clustered solution, where several processor cores or several processors can be used. In that case the database is divided into as many parts as there are NClusterNodes, and each NClusterNode uses only its own part of the database, which is smaller than the whole database, so matching is processed faster.

jacobg wrote:I'm looking into a solution to eventually be able to quickly identify a face from a set of thousands of templates.


It is not recommended to use only faces for identification if the database exceeds 2000 face records, as this will result in a high false acceptance rate. In cases where there are more than 2000 records, we recommend considering another biometric modality or a combination of modalities, i.e. fingers, fingers + faces, irises, irises + faces.

Fingerprint Matching Speed

PostPosted: Jul 03, 2013 06:57
by Namezz
Hi, now I"m using Verifinger 6.6 and try to set fingerprint matching speed.

I set matching speed to be "High" but matcher returns speed to "Medium".
In case of speed of matcher are medium and low, it uses type of speed correctly.

How do I set speed of matcher to high?

Admin: merged to "Matching questions"

Fingerprint Matching Speed

PostPosted: Jul 03, 2013 07:03
by Namezz
Hi, now I"m using a verifinger 6.6 SDK and try to set speed of matcher.

I set speed of matcher to high but it return speed to medium.
In case of speed medium and low, speed of matcher is set correctly.

How do I set speed of matcher to high?

Code: Select all
NMatcher matcher = null;
matcher = new NMatcher();
matcher.FingersMatchingSpeed = NMatchingSpeed.High;
Console.WriteLine(matcher.FingersMatchingSpeed.ToString());


It returns Medium.

Re: Fingerprint Matching Speed

PostPosted: Jul 04, 2013 06:43
by Martynas
Namezz wrote:Hi, now I"m using a verifinger 6.6 SDK and try to set speed of matcher.

I set speed of matcher to high but it return speed to medium.
In case of speed medium and low, speed of matcher is set correctly.

How do I set speed of matcher to high?

Code: Select all
NMatcher matcher = null;
matcher = new NMatcher();
matcher.FingersMatchingSpeed = NMatchingSpeed.High;
Console.WriteLine(matcher.FingersMatchingSpeed.ToString());


It returns Medium.

Hello,

This is correct behavior: when the High speed mode is set, the Medium speed mode is used instead.

Matching threshold and Matching speed setting

PostPosted: Jul 09, 2013 07:06
by jude
Hi, I'm testing face recognition using the Neurotec Biometric 4.4 SDK Trial.

When I set the parameters NMP_FACES_MATCHING_SPEED and NMP_FACES_MATCHING_THRESHOLD (or NMP_FACES_MATCHING_THRESHOLD_NEW),
the parameter setting functions, NObjectSetParameterEx and NObjectSetParameterWithPartEx, always return an "Invalid parameter" error.

Setting other parameters (NLEP_FACE_QUALITY_THRESHOLD, NLEP_MAX_ROLL_ANGLE_DEVIATION, NLEP_MAX_YAW_ANGLE_DEVIATION...) works well; only those two parameters return an error.

I tried all of the parameters that are similar to them,

NMP_FACES_MATCHING_SPEED,
NMP_FACES_MATCHING_THRESHOLD,
NMP_FACES_MATCHING_THRESHOLD_NEW,
NMP_MATCHING_THRESHOLD,
NLMP_MATCHING_THRESHOLD,
NLMP_MATCHING_SPEED,
NLMP_MATCHING_THRESHOLD,

but all failed.

I tested them with N_TYPE_INT and N_TYPE_DOUBLE each, like this:
Code: Select all
result = NObjectSetParameterEx(extractor, NMP_FACES_MATCHING_SPEED, N_TYPE_INT, &matchingSpeed, sizeof(matchingSpeed));
or,
result = NObjectSetParameterWithPartEx(extractor, NM_PART_NONE, NMP_FACES_MATCHING_THRESHOLD_NEW, N_TYPE_DOUBLE, &matchingThre, sizeof(matchingThre));
...

Is there any problem with this code?

Also, what is the parameter type of the matching threshold? I cannot find it.

Thanks.

Admin: merged to "Matching questions"

Re: Matching threshold and Matching speed setting

PostPosted: Jul 09, 2013 07:33
by Martynas
Hello,

The beginning of the constant name states which object it can be applied to: NMP means NMatcher Parameter, NLMP means NLMatcher Parameter, and NLEP means NLExtractor Parameter. According to the source code snippet you provided, you are trying to set NMatcher parameters on the NLExtractor, and this is the reason why you get this error. You should apply these parameters to the NMatcher object instead.

jude wrote:Also, what is the parameter type of the matching threshold?


N_TYPE_INT

Is there diffrence between NMatcher identify and verify?

PostPosted: Sep 12, 2013 22:47
by snijsure
Perhaps a basic question on identify vs. verify.

Say I have a fingerprint image, for which I generate a template using the extractor (that is a template, right?) - let's call it templateA.

Now I have the templates stored in my database and I want to find out if this fingerprint matches any of the records from my db.

It seems like one can do NMatcher.identifyStart(templateA) and then loop through the data from the db and do NMatcher.identifyNext(templateX).

Or one could do NMatcher.verify(templateA, templateN) and declare that templateN is the same as templateA if the method returns a value greater than zero?

So is it valid to assume that identifyStart/identifyNext can be used to do 1:N matching, or can one also use NMatcher.verify()?

Or am I confused regarding the NMatcher methods?

-Subodh

Admin: merged to "Matching questions"

Re: Is there diffrence between NMatcher identify and verify?

PostPosted: Sep 13, 2013 07:08
by Martynas
snijsure wrote:Perhaps a basic question on identify vs. verify.

Say I have a fingerprint image, for which I generate a template using the extractor (that is a template, right?) - let's call it templateA.

Now I have the templates stored in my database and I want to find out if this fingerprint matches any of the records from my db.

It seems like one can do NMatcher.identifyStart(templateA) and then loop through the data from the db and do NMatcher.identifyNext(templateX).

Or one could do NMatcher.verify(templateA, templateN) and declare that templateN is the same as templateA if the method returns a value greater than zero?

So is it valid to assume that identifyStart/identifyNext can be used to do 1:N matching, or can one also use NMatcher.verify()?

Or am I confused regarding the NMatcher methods?

-Subodh

Hello,

The Verify method is intended for 1:1 matching, but you can use it for identification too: pass the probe template as the first argument and the template from the database as the second argument, and call Verify as many times as there are records in the database.
Basically, the Verify method uses the same IdentifyStart, IdentifyNext and IdentifyEnd methods internally. Identification initialization is quite expensive in terms of time, so when 1:N matching (identification) is needed, it is recommended to use the sequence of IdentifyStart, IdentifyNext and IdentifyEnd methods, where IdentifyStart is called once and the templates are looped over with IdentifyNext. This increases the matching speed, as IdentifyStart is not called for every match.
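
A condensed sketch of that recommended pattern (variable names are illustrative; galleryTemplates is assumed to hold the templates loaded from your database as byte arrays or NBuffers):

Code: Select all
matcher.IdentifyStart(probeTemplate); // called once per probe
try
{
    for (int i = 0; i < galleryTemplates.Length; i++)
    {
        int score = matcher.IdentifyNext(galleryTemplates[i]);
        if (score > 0) // zero means the score fell below the matching threshold
            Console.WriteLine("Record {0} matched with score {1}", i, score);
    }
}
finally
{
    matcher.IdentifyEnd(); // called once, after looping over all records
}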

How to convert NBuffer to NTemplate

PostPosted: Oct 23, 2013 08:45
by Revelation
Hi

After capturing an image from a Fingerprint scanner, I want to immediately convert it to a Template and then scan the SQLite database for a match.

However, I only seem to be able to extract the image to an NBuffer object and I don't know how to convert it to an NTemplate in order to use LocalMatcher.Instance.Identify(nTemplate, db) to look for a match in the database.

Here is my code. Please could you help. Thank you.

Code: Select all
   
Dim extractor As New NFExtractor()
Dim record As NFRecord
Using image As NImage = fingerScanner.Capture()
    If image Is Nothing Then
        Console.WriteLine("Finger scanner unplugged. Exiting")
        Exit Sub
    End If

    Dim extractionStatus As NfeExtractionStatus
    Using grayscaleImage As NGrayscaleImage = image.ToGrayscale()
        If grayscaleImage.ResolutionIsAspectRatio OrElse grayscaleImage.HorzResolution < 250 OrElse grayscaleImage.VertResolution < 250 Then
            grayscaleImage.HorzResolution = 500
            grayscaleImage.VertResolution = 500
            grayscaleImage.ResolutionIsAspectRatio = False
        End If
        record = extractor.Extract(grayscaleImage, NFPosition.Unknown, NFImpressionType.LiveScanPlain, extractionStatus)
    End Using
    Dim nTemplate As NBuffer
    nTemplate = record.Save()
    Try
        Dim db As Database = WizardData.LocalDatabase
        WizardData.LocalMatchingResults = LocalMatcher.Instance.Identify(nTemplate, db)   ' <<<<< nTemplate must be NTemplate type??
        WizardData.ErrorMessage = ""
    Catch ex As Exception
        WizardData.ErrorMessage = ex.Message
    Finally
        MsgBox(WizardData.ResultMessage)
    End Try
End Using


Admin: merged to "Matching questions"

Re: How to convert NBuffer to NTemplate

PostPosted: Oct 23, 2013 09:44
by Martynas
Revelation wrote:Hi

After capturing an image from a Fingerprint scanner, I want to immediately convert it to a Template and then scan the SQLite database for a match.

However, I only seem to be able to extract the image to an NBuffer object and I don't know how to convert it to an NTemplate in order to use LocalMatcher.Instance.Identify(nTemplate, db) to look for a match in the database.

Here is my code. Please could you help. Thank you.

Code: Select all
   
Dim extractor As New NFExtractor()
Dim record As NFRecord
Using image As NImage = fingerScanner.Capture()
    If image Is Nothing Then
        Console.WriteLine("Finger scanner unplugged. Exiting")
        Exit Sub
    End If

    Dim extractionStatus As NfeExtractionStatus
    Using grayscaleImage As NGrayscaleImage = image.ToGrayscale()
        If grayscaleImage.ResolutionIsAspectRatio OrElse grayscaleImage.HorzResolution < 250 OrElse grayscaleImage.VertResolution < 250 Then
            grayscaleImage.HorzResolution = 500
            grayscaleImage.VertResolution = 500
            grayscaleImage.ResolutionIsAspectRatio = False
        End If
        record = extractor.Extract(grayscaleImage, NFPosition.Unknown, NFImpressionType.LiveScanPlain, extractionStatus)
    End Using
    Dim nTemplate As NBuffer
    nTemplate = record.Save()
    Try
        Dim db As Database = WizardData.LocalDatabase
        WizardData.LocalMatchingResults = LocalMatcher.Instance.Identify(nTemplate, db)   ' <<<<< nTemplate must be NTemplate type??
        WizardData.ErrorMessage = ""
    Catch ex As Exception
        WizardData.ErrorMessage = ex.Message
    Finally
        MsgBox(WizardData.ResultMessage)
    End Try
End Using

Hello,

NMatcher accepts NFRecords, NFTemplates and NTemplates saved as an NBuffer or as a byte array (byte[]).
The NFExtractor.Extract method returns an NFRecord, so you can save it to an NBuffer and use it with NMatcher.
If your "LocalMatcher.Instance.Identify" function requires an NBuffer, then there should be no problems with your code.
If this function requires an NTemplate, then you can create the NTemplate using the code below:
Code: Select all
Dim nTemplate As New NTemplate(record.Save())

Re: How to convert NBuffer to NTemplate

PostPosted: Oct 23, 2013 10:08
by Revelation
Thank you, Martynas. That worked perfectly.

Advice on fingerprints falsely rejected

PostPosted: Nov 28, 2013 12:05
by monkeyhandz
Hi

We have used the Java libraries of the Neurotec Biometric 4.5 SDK to develop our own biometric server for fingerprints.

It all works fine most of the time; people can enrol their fingerprints and then do successful verifications against the enrolled fingerprints.
However, we have found that some people are continually falsely rejected; they cannot get successful matches against their enrolled fingerprints.

In general we left the matching parameters at their defaults: matchingThreshold is at the default value and matchingSpeed is HIGH (we have bought the high-speed matcher license as well).
I have tried adjusting the matchingThreshold to 6 and the fingerprints do get matched then, but I'm worried about the implications of changing these values and how it might affect the False Acceptance Rate. Can you give any advice please?

If you want I can send you the Enrol and Verify Fingerprint templates if you want to test them.

Kind regards
David Smith

Admin: merged to "Matching questions"

Re: Advice on fingerprints falsely rejected

PostPosted: Nov 28, 2013 14:13
by vaidasz
monkeyhandz wrote:Hi

We have used the Java libraries of the Neurotec Biometric 4.5 SDK to develop our own biometric server for fingerprints.

It all works fine most of the time; people can enrol their fingerprints and then do successful verifications against the enrolled fingerprints.
However, we have found that some people are continually falsely rejected; they cannot get successful matches against their enrolled fingerprints.

In general we left the matching parameters at their defaults: matchingThreshold is at the default value and matchingSpeed is HIGH (we have bought the high-speed matcher license as well).
I have tried adjusting the matchingThreshold to 6 and the fingerprints do get matched then, but I'm worried about the implications of changing these values and how it might affect the False Acceptance Rate. Can you give any advice please?

If you want I can send you the Enrol and Verify Fingerprint templates if you want to test them.

Kind regards
David Smith


Hello David,

As described in the documentation, section "Matching Threshold and FAR/FRR":
"The biometric features matching algorithm provides a similarity score as a result. The higher the score, the higher the probability that the feature collections were obtained from the same person.
The higher the threshold value, the more similar the feature collections have to be to yield a positive result during matching.
Matching threshold - the minimum score that verification and identification functions accept to assume that the compared fingerprints, faces or irises belong to the same person.
The matching threshold is linked to the false acceptance rate (FAR, different subjects erroneously accepted as the same) of the matching algorithm. The higher the threshold, the lower the FAR and the higher the FRR (false rejection rate, same subjects erroneously rejected as different), and vice versa."
The matching threshold should be set according to your requirements.
Check the documentation about the matching threshold, FAR and FRR.

Please write an e-mail to support@neurotechnology.com. Provide the following information in the e-mail:
1. Diagnostics, generated using the Activation Wizard and saved to a file;
2. Templates that you expect to match;
3. Images (if you saved them) that were used to generate the templates in step 2.