Showing posts with label file. Show all posts

Wednesday, March 28, 2012

More info

In the Log File Viewer, I select Maintenance Plan and there I see two rows
marked with a green icon and a + sign to the left... I clicked on the + to
reveal what's under the row... then it revealed a row with a red stop icon
to the left...
On one of the rows with a red stop icon, I read this error message:
Could not generate mail report. An exception occurred while executing a
Transact-SQL statement or batch. No global profile is configured. Specify a
profile name in the @profile_name parameter.
Any suggestions?
Hi Jeff
It looks like it is having problems sending the email notification. Try
changing the maintenance plan to remove these. There is probably a notify
operator task that needs to be re-configured. Also check that you can send
test emails for database mail and SQL Agent mail configuration (in the
properties)
John
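As a concrete way to test the mail side (a hedged sketch - the profile name and address below are placeholders, not from the thread), Database Mail can be exercised directly with msdb.dbo.sp_send_dbmail, naming the profile explicitly so the "no global profile" error cannot occur:

```sql
-- Test message through an explicitly named Database Mail profile.
-- 'MaintProfile' and the recipient are placeholders; substitute a
-- profile and address that actually exist on your server.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MaintProfile',
    @recipients   = 'dba@example.com',
    @subject      = 'Database Mail test',
    @body         = 'If this arrives, the profile is configured correctly.';
```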
"Jeff" wrote:

> In the Log File Viewer, I select Maintenance Plan and there I see two rows
> marked with a green icon and a + sign to the left... I clicked on the + to
> reveal what's under the row... then it revealed a row with a red stop icon
> to the left...
> On one of the rows with a red stop icon, I read this error message:
> Could not generate mail report. An exception occurred while executing a
> Transact-SQL statement or batch. No global profile is configured. Specify a
> profile name in the @profile_name parameter.
> any suggestions?
>
>|||I modified the Maintenance Plan: I selected the maintenance plan and chose
"Modify", then I opened the "Reporting and Logging" window and unchecked
"Send report to an email recipient". Maybe the error was here... the agent
operator set as recipient was a domain account which no longer exists.
It's a long time since this domain account was removed - strange that it
would cause a problem for SQL Server now.
I saved my modifications to the maintenance plan...
How do I manually test whether this actually solved the problem?
Best Regards
Jeff|||Hi Jeff
It may be that the account had been disabled but the mailbox remained until
more recently!!
At least everything works now!
John
"Jeff" wrote:

> I modified the Maintenance Plan: I selected the maintenance plan and chose
> "Modify", then I opened the "Reporting and Logging" window and unchecked
> "Send report to an email recipient". Maybe the error was here... the agent
> operator set as recipient was a domain account which no longer exists.
> It's a long time since this domain account was removed - strange that it
> would cause a problem for SQL Server now.
> I saved my modifications to the maintenance plan...
> How do I manually test whether this actually solved the problem?
> Best Regards
> Jeff
>
Right-click on the maintenance plan and select Execute from the popup menu,
then View History for that plan (by right-clicking on it again...)
Ekrem nsoy
"Jeff" <donot@.spam.me> wrote in message
news:eDsSQBLQIHA.4584@.TK2MSFTNGP03.phx.gbl...
>I modified the Maintenance Plan: I selected the maintenance plan and chose
>"Modify", then I opened the "Reporting and Logging" window and unchecked
>"Send report to an email recipient". Maybe the error was here... the agent
>operator set as recipient was a domain account which no longer exists.
> It's a long time since this domain account was removed - strange that it
> would cause a problem for SQL Server now.
> I saved my modifications to the maintenance plan...
> How do I manually test whether this actually solved the problem?
> Best Regards
> Jeff
>

Monday, March 26, 2012

More DTS Help

Hello,
I'm using DTS to load two tables from a CSV file. I can do the transformations from the file into one table, but not into two. What I want is to check whether an email is valid: if it is, put it in one table, and if it isn't, in another. Here is the code I'm using for validating:

Function Main()
    ' EsEmail is the validation function
    If EsEmail(DTSSource("email")) Then
        DTSDestination("email") = DTSSource("email")
    Else
        ' Here I need HELP!
    End If

    Main = DTSTransformStat_OK
End Function

So what should I put in the else statement to transfer the email to another destination table?

Thanks,
A.Hello,

I think you are working with the DataPumpTask, aren't you?
If so, then you have to change the destination table in this task when going through the ELSE path.

For details how to change the properties of a DataPumpTask refer to http://www.sqldts.com/default.aspx?213.

Hope this helps otherwise give me a hint!
Carsten

Originally posted by ArgenSQL
Hello,
I'm using DTS to load two tables from a CSV file. I can do the transformations from the file into one table, but not into two. What I want is to check whether an email is valid: if it is, put it in one table, and if it isn't, in another. Here is the code I'm using for validating:

Function Main()
'esemail is the validation function
if EsEmail(DTSSource("email")) then
DTSDestination("email") = DTSSource("email")
else
'Here i need HELP!!!!!
end if

Main = DTSTransformStat_OK
End Function

So what should I put in the else statement to transfer the email to another destination table?

Thanks,
A.|||It might also suit you to make two tasks:
one for valid emails and the other for the invalid ones.

Good luck|||Thanks CarstenK, I read that sample and it helped a lot, but I'm still having a problem: I'm getting an error.
This is the code I'm using:

Function Main()
    Dim oPkg, oDataPump

    ' Get the package and the data pump task whose destination we change
    Set oPkg = DTSGlobalVariables.Parent
    Set oDataPump = oPkg.Tasks("DTSStep_DTSDataPumpTask_1").CustomTask

    If EsEmail(DTSSource("email")) Then
        oDataPump.DestinationObjectName = "dbo.ValidEmail"
        DTSDestination("email") = DTSSource("email")
    Else
        oDataPump.DestinationObjectName = "dbo.NonValidEmail"
        DTSDestination("email") = DTSSource("email")
    End If

    Set oDataPump = Nothing
    Set oPkg = Nothing

    Main = DTSTransformStat_OK
End Function

but I'm getting this error "Task 'DTSStep_DTSDataPumpTask_1' not found"

In the link you gave me they explain how to get the name, I did it but still the same error, and I'm sure the name is correct (copy & paste).

Maybe I'm doing something wrong. I'm using this ActiveX code in the "Transform Data Task Properties", "Transformations" tab, in the arrow connecting the source column with the destination column. I think that is where the script should go...
Well... if anyone can help me with this it would be great!

fhnth: making two tasks wouldn't be practical, because I would have to go through the table twice.

Bye,
A.|||Originally posted by ArgenSQL
Hello,
I'm using a DTS to make two tables of a csv passed file, I can make the transformations from the file to one table, but not for two, what I do is check if an email is valid and if it is I want to put it in one table, and if it isn't in another. Here is the code I'm using for validating:

Could you import to a staging table and use T-SQL to move the records into the correct destination table?|||Originally posted by RogerWilco
Could you import to a staging table and use T-SQL to move the records into the correct destination table?

I could, but I don't want to do that. What I want is a DTS package that does all the work in one (or a few) steps. I hope someone with experience in that can help me...
DTS has all the functionality in the world; I want to learn how to use it so I can do things in the best way, using all the technology I have access to.

Bye,
A.|||Hello !

After reading your reply a third time, I think I now know your problem. You said
Maybe I'm doing something wrong, I'm using this ActiveX code in the "Transform Data Task Properties", "transformations" tab, in the arrow connecting the source column with the destination column.

I think you should create an ActiveX task that contains your code, runs before the actual data import, and adjusts the properties.

I hope that brings you a step further!

Greetings,
Carsten|||I am not sure whether this can be achieved in a single DTS step (one source (a .csv file) and 2 destinations (2 diff. tables).

You can make a 2 step DTS package>
1). 1st Step:
source : .csv file
dest : 1st table
ActiveX :
If email1 = email2
Main = DTSTransformStat_OK
else
Main = DTSTransformStat_SkipRow
end if

2). 2nd Step:
source : .csv file
dest : 2nd table
ActiveX :
If email1 = email2
Main = DTSTransformStat_SkipRow
else
Main = DTSTransformStat_OK
end if|||Hi!

My thoughts are going this way:

Create an ActiveX task which determines whether an email is valid or not and then adjusts the appropriate destination for the following import task.
This way, you can avoid processing an email twice!

Carsten|||Originally posted by CarstenK
Create an ActiveX-Task which determines whether an email is valid or not and then adjusts the appropriate destination for the following import task.

I can't make it work. I tried every way and I still get different errors. If I put the ActiveX code before the data pump, I don't get an error resolving the task name, but I do get an error because I ask for the email from DTSSource, and that isn't available until the data phase. So I can assign a different destination table, but not driven by the email type. I can't find a solution for this; I searched the web a lot but couldn't find one that avoids passing through the table twice, so I'm doing that. I don't like it, but I have to finish this anyway. If anyone finds a way to do this, please tell me.

Bye, Thanks all for your help.
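For completeness, the staging-table route RogerWilco suggested could be sketched in T-SQL roughly like this (the table names and the LIKE pattern are illustrative assumptions, not from the thread - the pattern is only a crude stand-in for real email validation):

```sql
-- Load the whole CSV into dbo.StagingEmail first (one simple DTS step),
-- then split the rows with two set-based statements.
INSERT INTO dbo.ValidEmail (email)
SELECT email FROM dbo.StagingEmail
WHERE email LIKE '%_@_%._%';

INSERT INTO dbo.NonValidEmail (email)
SELECT email FROM dbo.StagingEmail
WHERE email NOT LIKE '%_@_%._%';
```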

More custom security questions

Still working on custom security :(
Since I can't get the name of a file (path) from within the CheckAccess method, it becomes somewhat useless. Is there a way to maybe overload the GetPermissions or Policies methods? Or maybe the CreateReport method, so that I can include some custom code there? Is there an example of something like that?

I wish the catalog tree were transparent to the authentication extension. I don't see the point of an ACL for a custom extension; all I want is names, and I can build on top of that. Something along the lines of -

Code Snippet

public bool CheckAccess(...file...)
{
    string[] permissions;
    permissions = server.GetPermissions(file);
    ...
}


Maybe I am missing something simple and I can tie everything to a security descriptor, but I don't see how I can if there is no information such as name, date, or modified-by name. All we get is the principal name, which is not very useful since I don't use built-in security names.

Thanks
So, has anybody implemented CheckAccess not totally based on the AceCollection?
|||bump

Friday, March 23, 2012

month ('jan','feb',...) string to date conversion fails

I use the Derived Column to convert a string date from a flat file like "Jan 02 2005" into a datetime.
I have seen in the forum the suggestion to use: (DT_DATE)(SUBSTRING(mydate,5,2) + "-" + SUBSTRING(mydate,1,3) + "-" + SUBSTRING(mydate,8,4))
However, even though it produces a string like '02-Jan-2005', the following cast to DT_DATE fails.
I have also tried inverting month and day, and year/month/day, but all with the same result:

Derived Column [73] Error: The component "Derived Column" failed because error code 0xC0049064 occurred, and the error row disposition on "output column"...

I think the cast fails because of the month format. Therefore the only solution would be to encode it in a lookup table: Jan, 01 | Feb, 02 | ... ?

srem wrote:

I use the Derived Column to convert a string date from a flat file like "Jan 02 2005" into a datetime.
I have seen in the forum the suggestion to use: (DT_DATE)(SUBSTRING(mydate,5,2) + "-" + SUBSTRING(mydate,1,3) + "-" + SUBSTRING(mydate,8,4))
However, even though it produces a string like '02-Jan-2005', the following cast to DT_DATE fails.
I have also tried inverting month and day, and year/month/day, but all with the same result:

Derived Column [73] Error: The component "Derived Column" failed because error code 0xC0049064 occurred, and the error row disposition on "output column"...

I think the cast fails because of the month format. Therefore the only solution would be to encode it in a lookup table: Jan, 01 | Feb, 02 | ... ?

It's not an ideal situation, but if you want to achieve this in a single Derived Column component, what you will need to do is build some nested conditional operators. It's a bit of a fraught process to begin with, but once you get the hang of it, it isn't too bad.

I hope Microsoft gives us the ability to extend the expression language in the next version by allowing us to build our own custom expression functions.

-Jamie

|||I suppose you could build a quick lookup table using a simple stored procedure. Then all you'd have to do is join on that "Jan 02 2005" column to get you a real date field in return.

Might even work better than doing all of the CASTs and conditional logic tests.|||

If you know exactly what the month strings are, and they are all 3 chars long, you could do something like the following to get the month number:

(FINDSTRING(month, "JANFEBMARAPR....DEC",1) + 2) / 3

If there are multiple options, or the lengths are irregular, you could still do something similar, but it might not be worth it anymore over the nested conditionals.
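The arithmetic in that FINDSTRING trick is easy to sanity-check outside SSIS. A small Python sketch (assuming three-letter month abbreviations, normalized to upper case):

```python
# Mirror of the SSIS expression (FINDSTRING(month, "JAN...DEC", 1) + 2) / 3.
# FINDSTRING is 1-based, hence the "+ 1" on Python's 0-based find().
MONTHS = "JANFEBMARAPRMAYJUNJULAUGSEPOCTNOVDEC"

def month_number(mon):
    pos = MONTHS.find(mon.upper()) + 1   # 1 for JAN, 4 for FEB, ...
    if pos == 0:
        raise ValueError("unknown month: %r" % mon)
    return (pos + 2) // 3

print(month_number("Jan"))  # 1
print(month_number("Dec"))  # 12
```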

Monsterlog file - short term solution?

Hi,
I've inherited admin of a database of which I was a user, so I'm new to this.
The log file has grown to an enormous size. This has happened in the past
(only a couple of times in a year) and the DB admin has fixed it. From
reading other posts I think I will set the recovery to SIMPLE (it is a data
warehouse). (However, whatever scheme was put in place by the admin
normally worked OK, even with recovery set to FULL, so I am not sure what has
changed.)
So, I still have the problem right now of the huge log file and a database
that will not respond. Can I fix the problem without backing up the log file
(to be honest the log file is of no use)? There is not enough space on disk
media to backup the log file. Is there a way of "fooling" SQL Server into
thinking it has been backed up?
Any help appreciated. And yes I will read up on BOL
Les R
Sounds like you have your recovery model set to FULL. If you
truly do not need the contents of the log file then you can
truncate it:
backup log <your db> with truncate_only
go
This will delete the inactive portion of the log.
You can then issue a DBCC SHRINKFILE command
to reclaim disk space the log may have requested from
the operating system in the course of its growth:
use <your db>
go
dbcc shrinkfile(<log name>, <size>)
go
Make sure you allocate enough initial space to the log
so it doesn't have to keep autogrowing (assuming it's set
to autogrow).
If you're using the FULL recovery model, consider setting it
to SIMPLE.
Before you do anything, make sure that you in fact
do not need the log info. And read up on BACKUP LOG
and DBCC SHRINKFILE before you run
them so you understand them.
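Before truncating, it may also help to confirm how full the log really is. Two standard, documented commands (run against your server):

```sql
-- Log size and percent-used for every database
DBCC SQLPERF(LOGSPACE);

-- Size, max size and growth settings of the current database's files
EXEC sp_helpfile;
```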
"Les Russell" <LesRussell@.discussions.microsoft.com> wrote in message
news:453CEC6E-4E5F-4FDB-BE4E-20AE8CB91C16@.microsoft.com...
> Hi,
> Ive inherited admin of a database of which I was a user, so I'm new to
this.
> The log file has grown to an enormous size. This has happened in the past
> (only a couple of times in a year) and the db admin has fixed it. From
> reading other posts I think I will set the recovery to SIMPLE (it is a
data
> warehouse). (However whatever scheme was in put in place by the admin
> normally worked OK, even with recovery set to FULL, so I am not sure what
has
> changed.)
> So, I still have the problem right now of the huge log file and a database
> that will not respond. Can I fix the problem without backing up the log
file
> (to be honest the log file is of no use)? There is not enough space on
disk
> media to backup the log file. Is there a way of "fooling" SQL Server into
> thinking it has been backed up?
> Any help appreciated. And yes I will read up on BOL
> Les R


Monday, March 12, 2012

Monitoring for files

Hi,
Does anyone know of a method using File System Objects to
monitor for filenames that are date & time stamped.
The filename will always follow the form "BACS Day File [ddmmyyhhmm]"
There should only ever be ONE file present in the directory - however
because it is timestamped I am finding it difficult to
write a procedure that will monitor for it.
The various methods I have looked at don't seem to support the use
of wildcards.
Would be grateful for any advice that is offered!!
Regards
Joe.Joe90 wrote:
> Hi,
> Does anyone know of a method using File System Objects to
> monitor for filenames that are date & time stamped.
No. However even if I did, I wouldn't use it. The FSO is an abomination.

> The filename will always follow the form "BACS Day File [ddmmyyhhmm]"
> There should only ever be ONE file present in the directory - however
> because it is timestamped I am finding it difficult to
> write a procedure that will monitor for it.
> The various methods I have looked at don't seem to support the use
> of wildcards.
Try http://vbnet.mvps.org/index.html?co...tchedfolder.htm
Regards,
Michael Cole|||Hi
Such as:
http://www.sqldts.com/default.aspx?246
Once a file is processed, archive it; that way any file in the input
directory can be loaded.
John
"Joe90" <matrix@.cyberdine.net> wrote in message
news:Jalde.14751$5A3.484@.newsfe4-win.ntli.net...
> Hi,
> Does anyone know of a method using File System Objects to
> monitor for filenames that are date & time stamped.
> The filename will always follow the form "BACS Day File [ddmmyyhhmm]"
> There should only ever be ONE file present in the directory - however
> because it is timestamped I am finding it difficult to
> write a procedure that will monitor for it.
> The various methods I have looked at don't seem to support the use
> of wildcards.
>
> Would be gratteful for any advice that is offered !!
> Regards
> Joe.|||On Mon, 02 May 2005 08:04:57 GMT, Joe90 <matrix@.cyberdine.net> wrote:
>Does anyone know of a method using File System Objects to
>monitor for filenames that are date & time stamped.
>The filename will always follow the form "BACS Day File [ddmmyyhhmm]"
>There should only ever be ONE file present in the directory - however
>because it is timestamped I am finding it difficult to
>write a procedure that will monitor for it.
>The various methods I have looked at don't seem to support the use
>of wildcards.
LISTING 1: Code to Capture a Command-Line Tool's Output
Set objShell = CreateObject("WScript.Shell")
Set objScriptExec = objShell.Exec("ipconfig /all")
strIpConfig = objScriptExec.StdOut.ReadAll
WScript.Echo strIpConfig
Try this with a dir and wildcard, I guess.
Good luck!
Josh|||In message <OkC06XvTFHA.2996@.TK2MSFTNGP15.phx.gbl>, Michael Cole
<noone@.hansen.com> writes
>Joe90 wrote:
>No. However even if I did, I wouldn't use it. The FSO is an abomination.
>
>Try http://vbnet.mvps.org/index.html?co...tchedfolder.htm
>
The FileSystemWatcher component in .NET may work, and is easier to use than
the API, if you are using .NET of course, but the VBnet stuff is very much for VB6.
Darren Green (SQL Server MVP)
DTS - http://www.sqldts.com
PASS - the definitive, global community for SQL Server professionals
http://www.sqlpass.org
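As a point of comparison outside VB and the FSO, the check the original post asks for - exactly one timestamped file matching a fixed prefix - is a few lines with wildcard globbing. A hedged Python sketch (the directory argument is an assumption; the filename prefix is from the post):

```python
import glob
import os

def find_bacs_file(directory):
    """Return the single 'BACS Day File [ddmmyyhhmm]' file in directory.

    The post says exactly one such file should ever be present, so
    anything other than one match is treated as an error.
    """
    pattern = os.path.join(directory, "BACS Day File*")
    matches = glob.glob(pattern)
    if len(matches) != 1:
        raise RuntimeError("expected exactly 1 file, found %d" % len(matches))
    return matches[0]
```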

Monitoring Filegroup Space

Does anyone know of a way to predict that the files in a filegroup do not
have enough space left to expand? For example, a file has 3753 MB out of a
maxsize of 4096, with a file growth of 10%. Adding 10% to 3753 will exceed
the max file size, so there is no room to expand even though technically
there is space available. There are 80 such files in the filegroup, and so it
can be deceptive when you look at the total number of MB left.

Hi Eydba,
Welcome to the MSDN newsgroup.
Regarding the filegroup space question you mentioned: based on my
research, I'm afraid there is so far no built-in feature that can
help detect the potential exceeding of the file size in a filegroup. Also,
there is no similar trigger or notification for this. To keep the
filegroup file sizes within an appropriate range, you can have a look at some
related performance-tuning articles:
#Optimizing SQL Server Performance by using File and Filegroups
http://www.mssqlcity.com/Articles/T...leOptimTips.htm
Regards,
Steven Cheng
Microsoft Online Community Support
==================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.|||As noted, there's no way SQL Server triggers this itself, but you can
set up monitoring around it.
Option 1: Write a stored procedure that checks this, run it from a
scheduled sql job, and monitor for failure. The info you need to do the
math is returned by sp_helpfile.
Option 2: Write a small .NET application that connects through the SMO.
Through the SMO you can retrieve the file size, growth setting, and max
size and then do some math and write to the event log or throw an alert
another way if it doesn't have room to grow.
Whichever way you do it, you need to run DBCC UPDATEUSAGE frequently for the
estimates of space used to be accurate.
-Kendra
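The check in Option 1 is mostly arithmetic once sp_helpfile has returned size, max size, and growth. A sketch of that math in Python, using the numbers from the original question (this models the poster's premise that a 10% increment which would overshoot maxsize leaves no room to expand):

```python
def can_autogrow(size_mb, max_size_mb, growth_pct):
    # Under percent growth the next increment is size * pct / 100; per the
    # question's premise, overshooting maxsize means no room to expand.
    next_size = size_mb * (1.0 + growth_pct / 100.0)
    return next_size <= max_size_mb

# The file from the question: 3753 MB of a 4096 MB max, 10% growth.
print(can_autogrow(3753, 4096, 10))   # False - 10% growth needs ~4128 MB
print(can_autogrow(3000, 4096, 10))   # True  - 3300 MB still fits
```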

Monitoring Disk Consumption/DB Growth

Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
manually checking it each day and comparing growth of the .bak file from my
DB maintenance plan. Is there a better way to measure daily growth of the DB?
Also, I have 4 other drives with non-DB image files that are added daily. Is
there an automated way or administrative tool to monitor the decline of free
disk space on 4 logical drives? Thanks, Pancho
You can check out DataStudio from AgileInfoSoftware at
http://www.agileinfollc.com, under the performance tab it has database
performance monitoring facility.
"Pancho" <Pancho@.discussions.microsoft.com> wrote in message
news:A3A8C7B3-382D-467E-A8F0-DE80AF1FFF79@.microsoft.com...
> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
> manually checking it each day and comparing growth of the .bak file from
> my
> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
> Also, I have 4 other drives with non-DB image files that are added daily.
> Is
> there an automated way or Administrative Tool to monitor the decline of
> free
> disk space on 4 logical drives? Thanks, Pancho
|||Hi
There is a script on
http://www.sqlservercentral.com/Scripts/listscripts.asp?categorynm=Maintenance%20and%20Management%20&categoryid=1
"SQL Server Database File Sizes" which may do what you require.
John
"Pancho" wrote:

> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
> manually checking it each day and comparing growth of the .bak file from my
> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
> Also, I have 4 other drives with non-DB image files that are added daily. Is
> there an automated way or Administrative Tool to monitor the decline of free
> disk space on 4 logical drives? Thanks, Pancho
|||Check out AgileInfoSoftware DataStudio for space monitoring.
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:25900858-65EE-41B6-9308-761E29428555@.microsoft.com...[vbcol=seagreen]
> Hi
> There is a script on
> http://www.sqlservercentral.com/Scripts/listscripts.asp?categorynm=Maintenance%20and%20Management%20&categoryid=1
> "SQL Server Database File Sizes" which may do what you require.
> John
> "Pancho" wrote:
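If you'd rather not depend on a third-party tool, a minimal starting point is to sample sysfiles daily into a history table (the SQL 2000-era catalog view; size is in 8 KB pages, so divide by 128 for MB). The history table name below is an illustrative assumption:

```sql
-- Run daily from a SQL Agent job; chart dbo.GrowthHistory over time.
INSERT INTO dbo.GrowthHistory (SampleDate, FileName, SizeMB)
SELECT GETDATE(), name, size / 128.0
FROM sysfiles;
```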

Monitoring Disk Consumption/DB Growth

Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
manually checking it each day and comparing growth of the .bak file from my
DB Maint. Plan. Is there a better way to measure daily growth of the DB.
Also, I have 4 other drives with non-DB image files that are added daily. I
s
there an automated way or Administrative Tool to monitor the decline of free
disk space on 4 logical drives? Thanks, PanchoYou can check out DataStudio from AgileInfoSoftware at
http://www.agileinfollc.com, under the performance tab it has database
performance monitoring facility.
"Pancho" <Pancho@.discussions.microsoft.com> wrote in message
news:A3A8C7B3-382D-467E-A8F0-DE80AF1FFF79@.microsoft.com...
> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
> manually checking it each day and comparing growth of the .bak file from
> my
> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
> Also, I have 4 other drives with non-DB image files that are added daily.
> Is
> there an automated way or Administrative Tool to monitor the decline of
> free
> disk space on 4 logical drives? Thanks, Pancho|||Hi
There is a script on
http://www.sqlservercentral.com/Scr...20&categoryid=1
"SQL Server Database File Sizes" which may do what you require.
John
"Pancho" wrote:

> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
> manually checking it each day and comparing growth of the .bak file from m
y
> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
> Also, I have 4 other drives with non-DB image files that are added daily.
Is
> there an automated way or Administrative Tool to monitor the decline of fr
ee
> disk space on 4 logical drives? Thanks, Pancho|||Check out AgileInfoSoftware DataStudio for space monitoring.
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:25900858-65EE-41B6-9308-761E29428555@.microsoft.com...[vbcol=seagreen]
> Hi
> There is a script on
> http://www.sqlservercentral.com/Scr...20&categoryid=1
> "SQL Server Database File Sizes" which may do what you require.
> John
> "Pancho" wrote:
>

Monitoring Disk Consumption/DB Growth

Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
manually checking it each day and comparing growth of the .bak file from my
DB Maint. Plan. Is there a better way to measure daily growth of the DB.
Also, I have 4 other drives with non-DB image files that are added daily. Is
there an automated way or Administrative Tool to monitor the decline of free
disk space on 4 logical drives? Thanks, PanchoYou can check out DataStudio from AgileInfoSoftware at
http://www.agileinfollc.com, under the performance tab it has database
performance monitoring facility.
"Pancho" <Pancho@.discussions.microsoft.com> wrote in message
news:A3A8C7B3-382D-467E-A8F0-DE80AF1FFF79@.microsoft.com...
> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
> manually checking it each day and comparing growth of the .bak file from
> my
> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
> Also, I have 4 other drives with non-DB image files that are added daily.
> Is
> there an automated way or Administrative Tool to monitor the decline of
> free
> disk space on 4 logical drives? Thanks, Pancho|||Hi
There is a script on
http://www.sqlservercentral.com/Scripts/listscripts.asp?categorynm=Maintenance%20and%20Management%20&categoryid=1
"SQL Server Database File Sizes" which may do what you require.
John
"Pancho" wrote:
> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
> manually checking it each day and comparing growth of the .bak file from my
> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
> Also, I have 4 other drives with non-DB image files that are added daily. Is
> there an automated way or Administrative Tool to monitor the decline of free
> disk space on 4 logical drives? Thanks, Pancho|||Check out AgileInfoSoftware DataStudio for space monitoring.
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:25900858-65EE-41B6-9308-761E29428555@.microsoft.com...
> Hi
> There is a script on
> http://www.sqlservercentral.com/Scripts/listscripts.asp?categorynm=Maintenance%20and%20Management%20&categoryid=1
> "SQL Server Database File Sizes" which may do what you require.
> John
> "Pancho" wrote:
>> Hello, I have a 125GB SQL DB whose growth I need to monitor. I've been
>> manually checking it each day and comparing growth of the .bak file from
>> my
>> DB Maint. Plan. Is there a better way to measure daily growth of the DB.
>> Also, I have 4 other drives with non-DB image files that are added daily.
>> Is
>> there an automated way or Administrative Tool to monitor the decline of
>> free
>> disk space on 4 logical drives? Thanks, Pancho
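The manual check Pancho describes can be scripted instead. A minimal sketch in Python (not from the thread above; drive paths and the threshold are assumptions): `shutil.disk_usage` reports free space per drive, and a small helper turns a chronological list of daily .bak sizes into day-over-day growth figures.

```python
import shutil

def free_mb(path):
    """Free space on the drive containing `path`, in MB."""
    return shutil.disk_usage(path).free / (1024 * 1024)

def daily_growth(sizes):
    """Day-over-day deltas for a chronological list of daily file sizes."""
    return [b - a for a, b in zip(sizes, sizes[1:])]

def low_space_drives(drives, threshold_mb):
    """Return the drives whose free space has fallen below threshold_mb."""
    return [d for d in drives if free_mb(d) < threshold_mb]
```

A script like this could be run daily from Task Scheduler or a SQL Agent job and the results appended to a log table.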

Friday, March 9, 2012

Monitoring Data file size

I am looking to automate monitoring space used for each file in each
database on a SQL Server 2000. Does anybody have any SQL Scripts to do
this or to find the space used?|||I'm not aware of a good/easy way from TSQL. You can use sp_spaceused to
find out how much space is available in the database as a whole - since
MSSQL aims to use each file more or less evenly, that might be good
enough to give you an idea.

Using SQLDMO, however, the DBFile object has a SpaceAvailableInMB
property, which will tell you exactly how much space is free. Behind
the scenes, it calls DBCC SHOWFILESTATS, which is undocumented.

In any case, since MSSQL aims to use the files roughly equally, it's
often more important to know how much free space is in the database as
a whole (sp_spaceused), or in the transaction log (DBCC SQLPERF).

Simon
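Simon's sp_spaceused suggestion is easy to automate, with one wrinkle: the sizes come back as strings like '125.00 MB'. A hedged sketch (the result-set columns are from sp_spaceused's documented output; the parser is the only logic here):

```python
# With no arguments, sp_spaceused's first result set has columns
# database_name, database_size, and "unallocated space", e.g. '125.00 MB'.
SPACEUSED_SQL = "EXEC sp_spaceused;"

def size_to_mb(text):
    """Convert an sp_spaceused size string like '8192 KB' or '1.5 GB' to MB."""
    value, unit = text.split()
    value = float(value)
    unit = unit.upper()
    if unit == "KB":
        return value / 1024.0
    if unit == "GB":
        return value * 1024.0
    return value  # assume MB
```

Run `SPACEUSED_SQL` from whatever client you schedule (osql, a DTS package, etc.) and feed the size columns through the parser before logging them.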

Monitoring a folder for a file - start ssis

Hi

I want my package to monitor a directory for a file, and when it detects one it will start processing the file. Any help would be appreciated.

Regards

Ants

The only (and easy) idea that comes to my mind is to schedule the package to run n times a day and to include conditional logic in the control flow based on the existence of files in the folder.

Rafael Salas

|||

You can use the WMI task to do this. There is a sample in BOL.

Or, you can download the File Watcher component. http://www.sqlis.com/default.aspx?23

|||WMI is a good option here, and the WMI Event Watcher Task will help you achieve this.

I remember reading somewhere that the memory used by the package will be released only when the package is stopped, so in your case, if the package is going to be waiting for the file (package in run state), we need to consider the side effects on memory.

Thanks,

S Suresh
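For comparison with the WMI Event Watcher approach above, here is what the polling variant Rafael describes looks like outside SSIS. A minimal sketch in Python (the folder, suffix, and timeout are assumptions); note it shares the caveat Suresh raises, since the process stays alive while it waits:

```python
import os
import time

def wait_for_file(folder, suffix=".csv", timeout=30.0, interval=1.0):
    """Poll `folder` until a file ending in `suffix` appears, or time out.

    Returns the full path of the first matching file, or None on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        matches = [f for f in sorted(os.listdir(folder)) if f.endswith(suffix)]
        if matches:
            return os.path.join(folder, matches[0])
        time.sleep(interval)
    return None
```

In production you would also want to make sure the writer has finished with the file (e.g. by checking that its size has stopped changing) before processing it.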

Monitoring a folder

Hi everyone,
I need to monitor a folder and write to a table the filename and datetime
of every file when it is added. Is it possible to do this in SQL Server, or do
I have to write a program to do this?
Thanks..
_Mario_|||Mario
You will be better off writing a program (.NET)
"Mario" <mariomario75@.yahoo.com> wrote in message
news:Obp4N9kUGHA.2444@.TK2MSFTNGP14.phx.gbl...
> Hi everyone,
> I need to monitor a folder to and write in a table the filename and
> datetime of every file when it is added. Is it possible to do this in sql
> server or I have to write a program to do this?
> Thanks..
> _Mario_
>|||I think it's possible to do it using SQL, but at the outset it seems more
useful and easier to take advantage of VBScript.
--
Current location: Alicante (ES)
"Mario" wrote:

> Hi everyone,
> I need to monitor a folder to and write in a table the filename and datetime
> of every file when it is added. Is it possible to do this in sql server or I
> have to write a program to do this?
> Thanks..
> _Mario_
>
>|||Do you mean a scheduled dts?
Mario
"Enric" <vtam13@.terra.es.(donotspam)> ha scritto nel messaggio
news:B542903E-2A4A-42D4-A733-EBC43C146E33@.microsoft.com...
>I think that it's possible do it using Sql but at the outset seems more
> useful and easier take advantatge of VBScript
> --
> Current location: Alicante (ES)
> "Mario" wrote:
>|||Yeah, I was talking about DTS.
We have a service that runs all day long checking whether there are
files in a folder list and, if so, launches a DTS with those files. But that
service is built with VB 6.0 and FSO.
--
Please post DDL, DCL and DML statements as well as any error message in
order to understand your request better. It's hard to provide information
without seeing the code. Current location: Alicante (ES)
"Mario" wrote:

> Do you mean a scheduled dts?
> Mario
>
> "Enric" <vtam13@.terra.es.(donotspam)> ha scritto nel messaggio
> news:B542903E-2A4A-42D4-A733-EBC43C146E33@.microsoft.com...
>
|||The best approach is to do this in a front-end application
Madhivanan
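The front-end/.NET program the repliers recommend is only a few lines in any language. A hedged sketch of the idea (shown in Python, with SQLite standing in for the real SQL Server table; the table and column names are assumptions), meant to be run on a schedule:

```python
import os
import sqlite3
from datetime import datetime, timezone

def log_new_files(folder, conn):
    """Record the name and detection time of files in `folder` not yet logged.

    Returns the list of newly logged filenames.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS file_log"
        " (filename TEXT PRIMARY KEY, detected_at TEXT)"
    )
    seen = {row[0] for row in conn.execute("SELECT filename FROM file_log")}
    added = []
    for name in sorted(os.listdir(folder)):
        if name not in seen:
            conn.execute(
                "INSERT INTO file_log (filename, detected_at) VALUES (?, ?)",
                (name, datetime.now(timezone.utc).isoformat()),
            )
            added.append(name)
    conn.commit()
    return added
```

The same pattern works against SQL Server with any client library; the VB 6/FSO service Enric mentions does essentially this, just event-free and scheduled.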

Wednesday, March 7, 2012

Monitor SQL data/log file % used

Is there a way to monitor the % (or size) of the data file used in a SQL
2005 database? I can use the Performance Monitor to track the log file (Log
file(s) used size (kb)), but I don't see the same option for the data file.
Am I just missing something?
Thanks|||Query select size from sys.database_files in SQL 2005, or select size from
sysfiles in SQL 2000.
--
Hilary Cotter
Director of Text Mining and Database Strategy
RelevantNOISE.Com - Dedicated to mining blogs for business intelligence.
This posting is my own and doesn't necessarily represent RelevantNoise's
positions, strategies or opinions.
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
"Tim Kelley" <tkelley@.company.com> wrote in message
news:O5Tb7pe0GHA.1040@.TK2MSFTNGP06.phx.gbl...
> Is there a way to monitor the % (or size) of the data file used in a SQL
> 2005 database? I can use the Performance Monitor to track the log file
> (Log file(s) used size (kb)), but I don't see the same option for the data
> file. Am I jut missing something?
> Thanks
>
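Hilary's `size` column only gives the allocated size (in 8 KB pages); to answer the "% used" question you also need `FILEPROPERTY(name, 'SpaceUsed')`. A hedged sketch (the T-SQL uses documented catalog views and FILEPROPERTY; the helper just does the page arithmetic):

```python
# sys.database_files reports size in 8 KB pages (so /128.0 converts to MB);
# FILEPROPERTY(name, 'SpaceUsed') returns the pages actually in use.
USED_PCT_SQL = """
SELECT name,
       size / 128.0 AS size_mb,
       FILEPROPERTY(name, 'SpaceUsed') / 128.0 AS used_mb
FROM sys.database_files;
"""

def pct_used(used_pages, size_pages):
    """Percent of an allocated file actually in use (both counts in 8 KB pages)."""
    if size_pages == 0:
        return 0.0
    return 100.0 * used_pages / size_pages
```

Run the query from your scheduled job of choice and feed the two page counts through the helper to get the percentage Performance Monitor doesn't expose for data files.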


Saturday, February 25, 2012

Monitor Filegrowth

Hi All
With SQL Server 2005 what is the best way to be alerted when autogrowth on a
file occurs?
Thanks|||Hello, use System Monitor to do this. See the below URL for more details:
http://support.microsoft.com/?kbid=299921
Thanks
Hari
"David" <David@.discussions.microsoft.com> wrote in message
news:BDFD49B2-D639-4573-885A-EC64AFE56153@.microsoft.com...
> Hi All
> With SQL Server 2005 what is the best way to be alerted when autogrowth on
> a
> file occurs?
> Thanks|||David
You can run SQL Server Profiler to monitor it
"David" <David@.discussions.microsoft.com> wrote in message
news:BDFD49B2-D639-4573-885A-EC64AFE56153@.microsoft.com...
> Hi All
> With SQL Server 2005 what is the best way to be alerted when autogrowth on
> a
> file occurs?
> Thanks|||I believe the default trace in SQL 2005 already monitors for this event. You
can poll the trace files periodically to see if this event has occurred.
--
Andrew J. Kelly SQL MVP
"David" <David@.discussions.microsoft.com> wrote in message
news:BDFD49B2-D639-4573-885A-EC64AFE56153@.microsoft.com...
> Hi All
> With SQL Server 2005 what is the best way to be alerted when autogrowth on
> a
> file occurs?
> Thanks|||The default trace does monitor this; however, it does not alert.
Events in the Default Trace (EventID, Event Name: Description):
18, Audit Server Starts and Stops: Occurs when the SQL Server service state is modified.
20, Audit Login Failed: Indicates that a login attempt to SQL Server from a client failed.
22, ErrorLog: Indicates that error events have been logged in the SQL Server error log.
46, Object:Created: Indicates that an object has been created, such as for CREATE INDEX, CREATE TABLE, and CREATE DATABASE statements.
47, Object:Deleted: Indicates that an object has been deleted, such as in DROP INDEX and DROP TABLE statements.
55, Hash Warning: Indicates that a hashing operation (for example, hash join, hash aggregate, hash union, and hash distinct) that is not processing on a buffer partition has reverted to an alternate plan. This can occur because of recursion depth, data skew, trace flags, or bit counting.
69, Sort Warnings: Indicates sort operations that do not fit into memory. Does not include sort operations involved in creating indexes; only sort operations within a query (such as an ORDER BY clause used in a SELECT statement).
79, Missing Column Statistics: Column statistics that could have been useful for the optimizer are not available.
80, Missing Join Predicate: A query that has no join predicate is being executed. This could result in a long-running query.
81, Server Memory Change: SQL Server memory usage has increased or decreased by either 1 megabyte (MB) or 5 percent of the maximum server memory, whichever is greater.
92, Data File Auto Grow: Indicates that a data file was extended automatically by the server.
93, Log File Auto Grow: Indicates that a log file was extended automatically by the server.
94, Data File Auto Shrink: Indicates that a data file was shrunk automatically by the server.
95, Log File Auto Shrink: Indicates that a log file was shrunk automatically by the server.
102, Audit Statement GDR Event: Occurs every time a GRANT, DENY, or REVOKE for a statement permission is issued by any user in SQL Server.
103, Audit Object GDR Event: Occurs every time a GRANT, DENY, or REVOKE for an object permission is issued by any user in SQL Server.
104, Audit AddLogin Event: Occurs when a SQL Server login is added or removed; for sp_addlogin and sp_droplogin.
105, Audit Login GDR Event: Occurs when a Windows login right is added or removed; for sp_grantlogin, sp_revokelogin, and sp_denylogin.
106, Audit Login Change Property Event: Occurs when a property of a login, except passwords, is modified; for sp_defaultdb and sp_defaultlanguage.
108, Audit Add Login to Server Role Event: Occurs when a login is added to or removed from a fixed server role; for sp_addsrvrolemember and sp_dropsrvrolemember.
109, Audit Add DB User Event: Occurs when a login is added or removed as a database user (Windows or SQL Server) to a database; for sp_grantdbaccess, sp_revokedbaccess, sp_adduser, and sp_dropuser.
110, Audit Add Member to DB Role Event: Occurs when a login is added to or removed from a database role (fixed or user-defined); for sp_addrolemember, sp_droprolemember, and sp_changegroup.
111, Audit Add Role Event: Occurs when a database role is added to or removed from a database; for sp_addrole and sp_droprole.
115, Audit Backup/Restore Event: Occurs when a BACKUP or RESTORE command is issued.
116, Audit DBCC Event: Occurs when DBCC commands are issued.
152, Audit Change Database Owner: Occurs when ALTER AUTHORIZATION is used to change the owner of a database and permissions are checked to do that.
153, Audit Schema Object Take Ownership Event: Occurs when ALTER AUTHORIZATION is used to assign an owner to an object and permissions are checked to do that.
155, FT:Crawl Started: Occurs when a full-text crawl (population) starts. Use to check if a crawl request is picked up by worker tasks.
156, FT:Crawl Stopped: Occurs when a full-text crawl (population) stops. Stops occur when a crawl completes successfully or when a fatal error occurs.
157, FT:Crawl Aborted: Occurs when an exception is encountered during a full-text crawl. Usually causes the full-text crawl to stop.
164, Object:Altered: Occurs when a database object is altered.
167, Database Mirroring State Change: Occurs when the state of a mirrored database changes.
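Andrew's suggestion, polling the default trace for the auto-grow events (92 and 93 in the list above), can be scripted. A hedged sketch: the T-SQL string uses the documented `sys.traces` view and `fn_trace_gettable` function, while the filter helper is plain Python so the event-class logic can be checked locally; how you run the query and raise the alert (email, Agent job, etc.) is up to you.

```python
# Event classes from the default-trace list above.
DATA_FILE_AUTO_GROW = 92
LOG_FILE_AUTO_GROW = 93

# T-SQL to read auto-grow events from the default trace; run this from your
# SQL client of choice on a schedule.
AUTOGROW_SQL = """
DECLARE @path NVARCHAR(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT DatabaseName, FileName, StartTime, EventClass
FROM fn_trace_gettable(@path, DEFAULT)
WHERE EventClass IN (92, 93);
"""

def autogrow_events(rows):
    """Keep only auto-grow rows from trace output (rows as dicts with 'EventClass')."""
    wanted = {DATA_FILE_AUTO_GROW, LOG_FILE_AUTO_GROW}
    return [r for r in rows if r.get("EventClass") in wanted]
```

Remember the last StartTime you have seen between polls so each growth event is reported only once.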


Monitor database file size

Is there a best practice for monitoring how much space is free in a database
file? I can get this info for the log file using the performance monitor
and send me a message when the free space is low. I don't want to have to
manually check this everyday, so if there is an automated means that would
be great.
Thanks.|||Hi,
Take a look at this article; it will help you automate db space
monitoring.
http://www.sql-server-performance.com/nn_monitor_with_dts_05.asp
Thanks
Hari
SQL Server MVP
"Tim Kelley" wrote:
> Is there a best practice for monitoring how much space is free in a database
> file? I can get this info for the log file using the performance monitor
> and send me a message when the free space is low. I don't want to have to
> manually check this everyday, so if there is an automated means that would
> be great.
> Thanks.
>
>