Mime-Version: 1.0
Date: Wed, 25 Jun 2003 10:50:09 +0000
Content-Transfer-Encoding: 7bit
Content-Type: text/plain
Organization: IEETA, UNIVERSIDADE de AVEIRO

Many thanks for your kind replies. Yes, I mean "a priori" probabilities.
To make a decision, the Bayesian decision rule is

    P(X|OBJECT) / P(X|NON-OBJECT) > lambda

Usually lambda = P(NON-OBJECT)/P(OBJECT). Here I just want to know how to
choose this lambda and the prior probabilities P(OBJECT) and
P(NON-OBJECT). For example, in face detection or recognition, this rule
can be used to decide whether a new pattern is a face or a non-face by
computing its likelihood ratio. The parameters of the probabilistic
model can be learnt from a training set. I am not sure whether the prior
probabilities should also be derived from the training set. For example,
suppose a training set contains 3000 object patterns and 15000
non-object patterns. In this case, can we say the threshold lambda
should be 15000/3000 = 5? Or can we still assume equal priors, which
means lambda should be 1?
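To make the comparison concrete, here is a minimal sketch of the two
choices of threshold. The class-conditional densities, their Gaussian
form, and all parameter values below are hypothetical stand-ins for
whatever model is learnt from the training set; only the class counts
(3000 objects, 15000 non-objects) come from the example above.

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical learnt parameters for P(X|OBJECT) and P(X|NON-OBJECT).
obj_mean, obj_var = 1.0, 0.5
non_mean, non_var = -1.0, 1.0

# Training-set class counts from the example above.
n_obj, n_non = 3000, 15000

# Option 1: priors estimated from training-set proportions.
lambda_empirical = n_non / n_obj   # 15000 / 3000 = 5.0

# Option 2: equal priors assumed.
lambda_equal = 1.0

def is_object(x, lam):
    """Likelihood-ratio test: P(X|OBJECT)/P(X|NON-OBJECT) > lambda."""
    ratio = gaussian_pdf(x, obj_mean, obj_var) / gaussian_pdf(x, non_mean, non_var)
    return ratio > lam

# A pattern whose likelihood ratio falls between the two thresholds is
# classified differently depending on which prior assumption is used.
x = 0.3
print(is_object(x, lambda_equal))      # accepted under equal priors
print(is_object(x, lambda_empirical))  # rejected under empirical priors
```

The sketch shows only the mechanics of the decision rule, not how the
densities themselves should be estimated.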
Many Thanks.
On Wed, 2003-06-25 at 02:38, leo horseman wrote:
>
> Could you be a little more specific? Do you mean a priori
> probabilities? Are you referring to the chance probabilities of an
> object belonging to a particular class or not belonging to that
> class? What do you mean by a "training set"? Can you give an example
> of a specific problem in terms of number of objects (entities, etc.)
> to be classified; number and type of variates from which a pattern is
> derived; type of values assigned to the variates, etc.?
>
> M. Childress